| datasetId | card |
|---|---|
recoilme/colorful_m | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3652615345.04
num_examples: 2490
download_size: 3640789678
dataset_size: 3652615345.04
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
qq912479431/chinese_video | ---
task_categories:
- text-generation
--- |
speech31/zeroth_korean_ipa | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: phonetic_codes
dtype: string
- name: ipa
dtype: string
splits:
- name: train
num_bytes: 2821876699.925
num_examples: 22263
- name: test
num_bytes: 60098108.0
num_examples: 457
download_size: 2882743112
dataset_size: 2881974807.925
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Tsuinzues/gosma | ---
license: openrail
---
|
anan-2024/twitter_dataset_1713003563 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 414864
num_examples: 1071
download_size: 216102
dataset_size: 414864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Matheus30cs/barakaMK | ---
license: openrail
---
|
kheopss/dspy_test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: answer
dtype: string
- name: gold_titles
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 295137.5738103508
num_examples: 575
- name: test
num_bytes: 1182603.4261896492
num_examples: 2304
- name: dev
num_bytes: 1182603.4261896492
num_examples: 2304
download_size: 1345736
dataset_size: 2660344.426189649
---
# Dataset Card for "dspy_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_prithivida__Asimov-7B-v1 | ---
pretty_name: Evaluation run of prithivida/Asimov-7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [prithivida/Asimov-7B-v1](https://huggingface.co/prithivida/Asimov-7B-v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_prithivida__Asimov-7B-v1_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-19T10:40:19.617701](https://huggingface.co/datasets/open-llm-leaderboard/details_prithivida__Asimov-7B-v1_public/blob/main/results_2023-11-19T10-40-19.617701.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5591327615228756,\n\
\ \"acc_stderr\": 0.033793433613618404,\n \"acc_norm\": 0.5679554479286705,\n\
\ \"acc_norm_stderr\": 0.034576211690701054,\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.5114755425083032,\n\
\ \"mc2_stderr\": 0.015500857240755488,\n \"em\": 0.004928691275167785,\n\
\ \"em_stderr\": 0.0007171872517059772,\n \"f1\": 0.06691170302013415,\n\
\ \"f1_stderr\": 0.0015363127511980274\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5460750853242321,\n \"acc_stderr\": 0.014549221105171864,\n\
\ \"acc_norm\": 0.590443686006826,\n \"acc_norm_stderr\": 0.01437035863247244\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6097390957976498,\n\
\ \"acc_stderr\": 0.004868117598481945,\n \"acc_norm\": 0.8004381597291377,\n\
\ \"acc_norm_stderr\": 0.00398854190214743\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286634,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286634\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.02507598176760168,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.02507598176760168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.026662010578567107,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.026662010578567107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164528,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5487179487179488,\n \"acc_stderr\": 0.025230381238934837,\n\
\ \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934837\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7431192660550459,\n \"acc_stderr\": 0.018732492928342472,\n \"\
acc_norm\": 0.7431192660550459,\n \"acc_norm_stderr\": 0.018732492928342472\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115071,\n \"\
acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864595,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864595\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n\
\ \"acc_stderr\": 0.016267000684598645,\n \"acc_norm\": 0.7075351213282248,\n\
\ \"acc_norm_stderr\": 0.016267000684598645\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.0264545781469315,\n\
\ \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.0264545781469315\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808848,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808848\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302895,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302895\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.027466610213140105,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.027466610213140105\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.027431623722415005,\n\
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.027431623722415005\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970473,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3852672750977836,\n\
\ \"acc_stderr\": 0.012429485434955194,\n \"acc_norm\": 0.3852672750977836,\n\
\ \"acc_norm_stderr\": 0.012429485434955194\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5686274509803921,\n \"acc_stderr\": 0.020036393768352638,\n \
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.020036393768352638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.5114755425083032,\n\
\ \"mc2_stderr\": 0.015500857240755488\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998297\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.004928691275167785,\n \
\ \"em_stderr\": 0.0007171872517059772,\n \"f1\": 0.06691170302013415,\n\
\ \"f1_stderr\": 0.0015363127511980274\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0932524639878696,\n \"acc_stderr\": 0.00800968883832857\n\
\ }\n}\n```"
repo_url: https://huggingface.co/prithivida/Asimov-7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|arc:challenge|25_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|drop|3_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|gsm8k|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hellaswag|10_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-40-19.617701.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T10-40-19.617701.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- '**/details_harness|winogrande|5_2023-11-19T10-40-19.617701.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-19T10-40-19.617701.parquet'
- config_name: results
data_files:
- split: 2023_11_19T10_40_19.617701
path:
- results_2023-11-19T10-40-19.617701.parquet
- split: latest
path:
- results_2023-11-19T10-40-19.617701.parquet
---
# Dataset Card for Evaluation run of prithivida/Asimov-7B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/prithivida/Asimov-7B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [prithivida/Asimov-7B-v1](https://huggingface.co/prithivida/Asimov-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_prithivida__Asimov-7B-v1_public",
"harness_winogrande_5",
split="train")
```
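Each per-task configuration exposes one split per run timestamp plus a `latest` alias. If you want to resolve the newest timestamped split yourself (for example, when comparing successive runs), note that the `YYYY_MM_DDTHH_MM_SS` split names shown above are zero-padded, so lexicographic order matches chronological order. A minimal sketch (the helper name is ours, not part of the `datasets` API):

```python
def newest_run_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern '2023_11_19T10_40_19.617701';
    because every field is zero-padded, lexicographic order matches
    chronological order, so max() picks the newest run.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["2023_11_19T10_40_19.617701", "latest"]
print(newest_run_split(splits))  # -> 2023_11_19T10_40_19.617701
```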
## Latest results
These are the [latest results from run 2023-11-19T10:40:19.617701](https://huggingface.co/datasets/open-llm-leaderboard/details_prithivida__Asimov-7B-v1_public/blob/main/results_2023-11-19T10-40-19.617701.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.5591327615228756,
"acc_stderr": 0.033793433613618404,
"acc_norm": 0.5679554479286705,
"acc_norm_stderr": 0.034576211690701054,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.5114755425083032,
"mc2_stderr": 0.015500857240755488,
"em": 0.004928691275167785,
"em_stderr": 0.0007171872517059772,
"f1": 0.06691170302013415,
"f1_stderr": 0.0015363127511980274
},
"harness|arc:challenge|25": {
"acc": 0.5460750853242321,
"acc_stderr": 0.014549221105171864,
"acc_norm": 0.590443686006826,
"acc_norm_stderr": 0.01437035863247244
},
"harness|hellaswag|10": {
"acc": 0.6097390957976498,
"acc_stderr": 0.004868117598481945,
"acc_norm": 0.8004381597291377,
"acc_norm_stderr": 0.00398854190214743
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286634,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286634
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.02507598176760168,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.02507598176760168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567107,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164528,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934837,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934837
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7431192660550459,
"acc_stderr": 0.018732492928342472,
"acc_norm": 0.7431192660550459,
"acc_norm_stderr": 0.018732492928342472
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864595,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864595
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7075351213282248,
"acc_stderr": 0.016267000684598645,
"acc_norm": 0.7075351213282248,
"acc_norm_stderr": 0.016267000684598645
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5924855491329479,
"acc_stderr": 0.0264545781469315,
"acc_norm": 0.5924855491329479,
"acc_norm_stderr": 0.0264545781469315
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808848,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808848
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302895,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302895
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.027466610213140105,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.027466610213140105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.027431623722415005,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.027431623722415005
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.02927553215970473,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.02927553215970473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3852672750977836,
"acc_stderr": 0.012429485434955194,
"acc_norm": 0.3852672750977836,
"acc_norm_stderr": 0.012429485434955194
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.020036393768352638,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.020036393768352638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.5114755425083032,
"mc2_stderr": 0.015500857240755488
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998297
},
"harness|drop|3": {
"em": 0.004928691275167785,
"em_stderr": 0.0007171872517059772,
"f1": 0.06691170302013415,
"f1_stderr": 0.0015363127511980274
},
"harness|gsm8k|5": {
"acc": 0.0932524639878696,
"acc_stderr": 0.00800968883832857
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Aunsiels/Ascent-GenT | ---
license: mit
task_categories:
- question-answering
language:
- en
pretty_name: Ascent-GenT
--- |
dotan1111/MSA-amino-4-seq | ---
tags:
- sequence-to-sequence
- bioinformatics
- biology
---
# Multiple Sequence Alignment as a Sequence-to-Sequence Learning Problem
## Abstract:
The sequence alignment problem is one of the most fundamental problems in bioinformatics, and a plethora of methods have been devised to tackle it. Here we introduce BetaAlign, a methodology for aligning sequences using an NLP approach. BetaAlign accounts for the possible variability of the evolutionary process among different datasets by using an ensemble of transformers, each trained on millions of samples generated from a different evolutionary model. Our approach leads to alignment accuracy that is similar to, and often better than, that of commonly used methods, such as MAFFT, DIALIGN, ClustalW, T-Coffee, PRANK, and MUSCLE.

An illustration of aligning sequences with sequence-to-sequence learning. (a) Consider two input sequences "AAG" and "ACGG". (b) The result of encoding the unaligned sequences into the source language (*Concat* representation). (c) The sentence from the source language is translated to the target language via a transformer model. (d) The translated sentence in the target language (*Spaces* representation). (e) The resulting alignment, decoded from the translated sentence, in which "AA-G" is aligned to "ACGG". The transformer architecture illustration is adapted from (Vaswani et al., 2017).
## Data:
We used SpartaABC (Loewenthal et al., 2021) to generate millions of true alignments. SpartaABC requires the following input: (1) a rooted phylogenetic tree, which includes a topology and branch lengths; (2) a substitution model (amino acids or nucleotides); (3) root sequence length; (4) the indel model parameters, which include: insertion rate (*R_I*), deletion rate (*R_D*), a parameter for the insertion Zipfian distribution (*A_I*), and a parameter for the deletion Zipfian distribution (*A_D*). MSAs were simulated along random phylogenetic tree topologies generated using the program ETE version 3.0 (Huerta-Cepas et al., 2016) with default parameters.
We generated 1,495,000, 2,000, and 3,000 protein MSAs of ten sequences each, used as training, validation, and testing data, respectively. We generated the same number of DNA MSAs. For each random tree, branch lengths were drawn from a uniform distribution in the range *(0.5,1.0)*. Next, the sequences were generated using SpartaABC with the following parameters: *R_I,R_D \in (0.0,0.05)*, *A_I, A_D \in (1.01,2.0)*. The alignment lengths, as well as the sequence lengths at the tree leaves, vary within and among datasets, as they depend on the indel dynamics and the root length. The root length was sampled uniformly in the range *[32,44]*. Unless stated otherwise, all protein datasets were generated under the WAG+G model and all DNA datasets under the GTR+G model, with the following parameters: (1) nucleotide frequencies *(0.37, 0.166, 0.307, 0.158)*, in the order "T", "C", "A", and "G"; (2) substitution rates *(0.444, 0.0843, 0.116, 0.107, 0.00027)*, in the order "a", "b", "c", "d", and "e" of the substitution matrix.
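The priors above can be sketched as a small sampler. This is a minimal illustration of the parameter ranges only; the function name is ours, and SpartaABC itself is a separate program with its own interface:

```python
import random

def sample_simulation_params(seed=None):
    """Draw one set of simulation inputs from the priors described above.

    Illustrative only; not part of the SpartaABC API.
    """
    rng = random.Random(seed)
    return {
        "branch_length": rng.uniform(0.5, 1.0),  # per-branch, uniform (0.5, 1.0)
        "R_I": rng.uniform(0.0, 0.05),           # insertion rate
        "R_D": rng.uniform(0.0, 0.05),           # deletion rate
        "A_I": rng.uniform(1.01, 2.0),           # insertion Zipfian parameter
        "A_D": rng.uniform(1.01, 2.0),           # deletion Zipfian parameter
        "root_length": rng.randint(32, 44),      # uniform over [32, 44]
    }
```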
## Example:
The following example corresponds to the MSA illustrated in the figure above:
{"MSA": "AAAC-GGG", "unaligned_seqs": {"seq0": "AAG", "seq1": "ACGG"}}
## APA
```
Dotan, E., Belinkov, Y., Avram, O., Wygoda, E., Ecker, N., Alburquerque, M., Keren, O., Loewenthal, G., & Pupko T. (2023). Multiple sequence alignment as a sequence-to-sequence learning problem. The Eleventh International Conference on Learning Representations (ICLR 2023).
```
## BibTeX
```
@inproceedings{Dotan_multiple_2023,
author = {Dotan, Edo and Belinkov, Yonatan and Avram, Oren and Wygoda, Elya and Ecker, Noa and Alburquerque, Michael and Keren, Omri and Loewenthal, Gil and Pupko, Tal},
booktitle = {{The Eleventh International Conference on Learning Representations (ICLR 2023)}},
month = aug,
title = {{Multiple sequence alignment as a sequence-to-sequence learning problem}},
year = {2023}
}
``` |
sravaniayyagari/apr2 | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Content
dtype: string
splits:
- name: train
num_bytes: 3083890
num_examples: 1713
- name: validation
num_bytes: 326379
num_examples: 189
download_size: 433660
dataset_size: 3410269
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
razsoriginals/shiachat | ---
license: apache-2.0
---
|
virtualvoidsteve/code_correction_dataset_2150 | ---
dataset_info:
features:
- name: corrupted
dtype: string
- name: corrected
dtype: string
splits:
- name: train
num_bytes: 95006
num_examples: 114
download_size: 30646
dataset_size: 95006
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
amaandhada/easter-egg | ---
license: apache-2.0
---
|
sahilkadge/medical_audio_dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': dev
'1': test
'2': train
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 39043219.0
num_examples: 49
- name: validation
num_bytes: 980847.0
num_examples: 1
- name: test
num_bytes: 5066563.0
num_examples: 7
download_size: 44985291
dataset_size: 45090629.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-36c277-93197145789 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: google/pegasus-large
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-large
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sasha](https://huggingface.co/sasha) for evaluating this model. |
SAGI-1/Greetings_DPO_dataset_V1 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1204169
num_examples: 891
download_size: 453426
dataset_size: 1204169
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nc33/triplet_sbert_law | ---
license: mit
---
|
gagan3012/dolphin-retrival-LAREQA-QA-qrels | ---
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int32
splits:
- name: test
num_bytes: 7378
num_examples: 119
download_size: 3375
dataset_size: 7378
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
Kamaljp/comments_classed | ---
dataset_info:
features:
- name: top_comment
dtype: string
- name: reply
dtype: string
- name: top_author
dtype: string
- name: reply_author
dtype: string
- name: comment_date
dtype: string
- name: comment_time
dtype: string
- name: reply_date
dtype: string
- name: reply_time
dtype: string
- name: reply_class
dtype: string
- name: comment_class
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1805186
num_examples: 4424
download_size: 894372
dataset_size: 1805186
---
# Dataset Card for "comments_classed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AISE-TUDelft/nlbse_ccc | ---
configs:
- config_name: default
data_files:
- split: java_Pointer
path: data/java_Pointer-*
- split: java_Expand
path: data/java_Expand-*
- split: java_Ownership
path: data/java_Ownership-*
- split: java_deprecation
path: data/java_deprecation-*
- split: java_rational
path: data/java_rational-*
- split: java_summary
path: data/java_summary-*
- split: java_usage
path: data/java_usage-*
- split: python_Expand
path: data/python_Expand-*
- split: python_Summary
path: data/python_Summary-*
- split: python_DevelopmentNotes
path: data/python_DevelopmentNotes-*
- split: python_Parameters
path: data/python_Parameters-*
- split: python_Usage
path: data/python_Usage-*
- split: pharo_Example
path: data/pharo_Example-*
- split: pharo_Keymessages
path: data/pharo_Keymessages-*
- split: pharo_Responsibilities
path: data/pharo_Responsibilities-*
- split: pharo_Keyimplementationpoints
path: data/pharo_Keyimplementationpoints-*
- split: pharo_Collaborators
path: data/pharo_Collaborators-*
- split: pharo_Intent
path: data/pharo_Intent-*
- split: pharo_Classreferences
path: data/pharo_Classreferences-*
dataset_info:
features:
- name: comment_sentence_id
dtype: int64
- name: class
dtype: string
- name: comment_sentence
dtype: string
- name: partition
dtype: int64
- name: instance_type
dtype: int64
- name: category
dtype: string
- name: label
dtype: int64
- name: combo
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: java_Pointer
num_bytes: 483600
num_examples: 2418
- name: java_Expand
num_bytes: 481182
num_examples: 2418
- name: java_Ownership
num_bytes: 488436
num_examples: 2418
- name: java_deprecation
num_bytes: 493272
num_examples: 2418
- name: java_rational
num_bytes: 486018
num_examples: 2418
- name: java_summary
num_bytes: 483600
num_examples: 2418
- name: java_usage
num_bytes: 478764
num_examples: 2418
- name: python_Expand
num_bytes: 421025
num_examples: 2555
- name: python_Summary
num_bytes: 423580
num_examples: 2555
- name: python_DevelopmentNotes
num_bytes: 446575
num_examples: 2555
- name: python_Parameters
num_bytes: 431245
num_examples: 2555
- name: python_Usage
num_bytes: 418470
num_examples: 2555
- name: pharo_Example
num_bytes: 368156
num_examples: 1765
- name: pharo_Keymessages
num_bytes: 375216
num_examples: 1765
- name: pharo_Responsibilities
num_bytes: 384041
num_examples: 1765
- name: pharo_Keyimplementationpoints
num_bytes: 396396
num_examples: 1765
- name: pharo_Collaborators
num_bytes: 378746
num_examples: 1765
- name: pharo_Intent
num_bytes: 366391
num_examples: 1765
- name: pharo_Classreferences
num_bytes: 382276
num_examples: 1765
download_size: 3231436
dataset_size: 8186989
task_categories:
- text-classification
size_categories:
- 10K<n<100K
---
# Dataset Card for "nlbse_ccc"
A dataset object for the NLBSE'23 Code Comment Classification competition. Please refer to the original [GitHub repository](https://github.com/nlbse2023/code-comment-classification) for more details.
## Category distribution in the training and test sets
The table below shows the distribution of positive/negative sentences for each category in the training and testing sets.
| Language | Category | Training Positive | Training Negative | Testing Positive | Testing Negative | Total |
|----------|-------------------------|------------------:|------------------:|-----------------:|-----------------:|------:|
| Java | Expand | 505 | 1426 | 127 | 360 | 2418 |
| Java | Ownership | 90 | 1839 | 25 | 464 | 2418 |
| Java | Deprecation | 100 | 1831 | 27 | 460 | 2418 |
| Java | Rational | 223 | 1707 | 57 | 431 | 2418 |
| Java | Summary | 328 | 1600 | 87 | 403 | 2418 |
| Java | Pointer | 289 | 1640 | 75 | 414 | 2418 |
| Java | Usage | 728 | 1203 | 184 | 303 | 2418 |
| Pharo | Responsibilities | 267 | 1139 | 69 | 290 | 1765 |
| Pharo | Keymessages | 242 | 1165 | 63 | 295 | 1765 |
| Pharo | Keyimplementationpoints | 184 | 1222 | 48 | 311 | 1765 |
| Pharo | Collaborators | 99 | 1307 | 28 | 331 | 1765 |
| Pharo | Example | 596 | 812 | 152 | 205 | 1765 |
| Pharo | Classreferences | 60 | 1348 | 17 | 340 | 1765 |
| Pharo | Intent | 173 | 1236 | 45 | 311 | 1765 |
| Python | Expand | 402 | 1637 | 102 | 414 | 2555 |
| Python | Parameters | 633 | 1404 | 161 | 357 | 2555 |
| Python | Summary | 361 | 1678 | 93 | 423 | 2555 |
| Python | Developmentnotes | 247 | 1792 | 65 | 451 | 2555 |
| Python | Usage | 637 | 1401 | 163 | 354 | 2555 |
## Code
The following code snippet was used to create the dataset:
```python
# !git clone https://github.com/nlbse2023/code-comment-classification.git
import pandas as pd
from datasets import Dataset, DatasetDict
langs = ['java', 'python', 'pharo']
lan_cats = []
dataset_dict = DatasetDict()
for lan in langs: # for each language
df = pd.read_csv(f'./code-comment-classification/{lan}/input/{lan}.csv')
df['label'] = df.instance_type
df['combo'] = df[['comment_sentence', 'class']].agg(' | '.join, axis=1)
print(df.columns)
cats = list(map(lambda x: lan + '_' + x, list(set(df.category))))
lan_cats = lan_cats + cats
for cat in list(set(df.category)): # for each category
filtered = df[df.category == cat]
dataset_dict[f'{lan}_{cat}'] = Dataset.from_pandas(filtered)
dataset_dict.push_to_hub("AISE-TUDelft/nlbse_ccc", token='hf_********************')
```
|
YxBxRyXJx/cat_train | ---
license: apache-2.0
---
## This dataset is a collection of Q&A about how to keep cats.
It was created by reorganizing English- and Japanese-language information from the internet.
Feel free to use it for fine-tuning an LLM.
The context is written in English.
A reference blog post is available [here](https://jpnqeur23lmqsw.blogspot.com/2023/09/qeur23llmdss9llm.html).
|
MalacoiHebraico/minhavoz123 | ---
license: openrail
---
|
jamil/soap_notes | ---
license: apache-2.0
---
# Dataset Card for SOAP
|
Howard001/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 2619287
num_examples: 200
download_size: 463750
dataset_size: 2619287
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B | ---
pretty_name: Evaluation run of AIJUUD/juud-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AIJUUD/juud-Mistral-7B](https://huggingface.co/AIJUUD/juud-Mistral-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T00:46:47.329333](https://huggingface.co/datasets/open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B/blob/main/results_2024-02-02T00-46-47.329333.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6298199576276905,\n\
\ \"acc_stderr\": 0.032307298049248236,\n \"acc_norm\": 0.6379923251397444,\n\
\ \"acc_norm_stderr\": 0.032981988953013575,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5412316525132941,\n\
\ \"mc2_stderr\": 0.015338639083594787\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111726,\n\
\ \"acc_norm\": 0.6672354948805461,\n \"acc_norm_stderr\": 0.013769863046192307\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6591316470822546,\n\
\ \"acc_stderr\": 0.004730324556624128,\n \"acc_norm\": 0.8500298745269866,\n\
\ \"acc_norm_stderr\": 0.003563124427458522\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642507,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642507\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431367,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431367\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n\
\ \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545843,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545843\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n\
\ \"acc_stderr\": 0.015609929559348397,\n \"acc_norm\": 0.3206703910614525,\n\
\ \"acc_norm_stderr\": 0.015609929559348397\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532072,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532072\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000318,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000318\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5412316525132941,\n\
\ \"mc2_stderr\": 0.015338639083594787\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.01164627675508969\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2312357846853677,\n \
\ \"acc_stderr\": 0.011613587503166618\n }\n}\n```"
repo_url: https://huggingface.co/AIJUUD/juud-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|arc:challenge|25_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|gsm8k|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hellaswag|10_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-46-47.329333.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T00-46-47.329333.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- '**/details_harness|winogrande|5_2024-02-02T00-46-47.329333.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T00-46-47.329333.parquet'
- config_name: results
data_files:
- split: 2024_02_02T00_46_47.329333
path:
- results_2024-02-02T00-46-47.329333.parquet
- split: latest
path:
- results_2024-02-02T00-46-47.329333.parquet
---
# Dataset Card for Evaluation run of AIJUUD/juud-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIJUUD/juud-Mistral-7B](https://huggingface.co/AIJUUD/juud-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B",
"harness_winogrande_5",
	split="latest")
```
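Besides "latest", each configuration also exposes one split per evaluation run, named after the run's timestamp. In this card, the split name appears to be the filename timestamp with its dashes replaced by underscores; a minimal helper illustrating that mapping (the function name is our own, not part of the `datasets` API):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp (as used in the parquet filenames,
    e.g. '2024-02-02T00-46-47.329333') into the corresponding split
    name (e.g. '2024_02_02T00_46_47.329333')."""
    return timestamp.replace("-", "_")

# The timestamp of the single run recorded in this dataset:
latest_run = "2024-02-02T00-46-47.329333"
print(run_timestamp_to_split(latest_run))  # 2024_02_02T00_46_47.329333

# That split name can then be passed to load_dataset instead of "latest":
# data = load_dataset("open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B",
#                     "harness_winogrande_5",
#                     split=run_timestamp_to_split(latest_run))
```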
## Latest results
These are the [latest results from run 2024-02-02T00:46:47.329333](https://huggingface.co/datasets/open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B/blob/main/results_2024-02-02T00-46-47.329333.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6298199576276905,
"acc_stderr": 0.032307298049248236,
"acc_norm": 0.6379923251397444,
"acc_norm_stderr": 0.032981988953013575,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5412316525132941,
"mc2_stderr": 0.015338639083594787
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111726,
"acc_norm": 0.6672354948805461,
"acc_norm_stderr": 0.013769863046192307
},
"harness|hellaswag|10": {
"acc": 0.6591316470822546,
"acc_stderr": 0.004730324556624128,
"acc_norm": 0.8500298745269866,
"acc_norm_stderr": 0.003563124427458522
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642507,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642507
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477114,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229862,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229862
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431367,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431367
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545843,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545843
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348397,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532072,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532072
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000318,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5412316525132941,
"mc2_stderr": 0.015338639083594787
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.01164627675508969
},
"harness|gsm8k|5": {
"acc": 0.2312357846853677,
"acc_stderr": 0.011613587503166618
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hippocrates/medical_meadow_advice_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 9431718
num_examples: 8676
download_size: 2439830
dataset_size: 9431718
---
# Dataset Card for "medical_meadow_advice_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nourheshamshaheen/ICPR_testing_check | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': area
'1': heatmap
'2': horizontal bar
'3': horizontal interval
'4': line
'5': manhattan
'6': map
'7': pie
'8': scatter
'9': scatter-line
'10': surface
'11': venn
'12': vertical bar
'13': vertical box
'14': vertical interval
splits:
- name: train
num_bytes: 815174169.98
num_examples: 11388
download_size: 716823350
dataset_size: 815174169.98
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ICPR_testing_check"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/chen_hai_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of chen_hai/鎮海/镇海 (Azur Lane)
This is the dataset of chen_hai/鎮海/镇海 (Azur Lane), containing 132 images and their tags.
The core tags of this character are `black_hair, breasts, large_breasts, hair_ornament, long_hair, bangs, purple_eyes, red_eyes, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 132 | 243.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chen_hai_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 132 | 112.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chen_hai_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 331 | 245.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chen_hai_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 132 | 200.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chen_hai_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 331 | 366.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chen_hai_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
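The download URLs in the table above follow a uniform `dataset-<name>.zip` naming scheme. A minimal sketch for deriving a package's archive filename and fetching it with `huggingface_hub` — the `package_filename` helper is illustrative only, not part of the repository:

```python
def package_filename(name: str) -> str:
    # Map a package name from the table above to its archive filename,
    # following the scheme visible in the download URLs
    # (e.g. '800' -> 'dataset-800.zip', 'raw' -> 'dataset-raw.zip').
    return f'dataset-{name}.zip'

if __name__ == '__main__':
    # Fetch e.g. the 800px IMG+TXT package (requires huggingface_hub
    # and network access).
    from huggingface_hub import hf_hub_download
    zip_file = hf_hub_download(
        repo_id='CyberHarem/chen_hai_azurlane',
        repo_type='dataset',
        filename=package_filename('800'),
    )
    print(zip_file)
```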
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chen_hai_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, bodystocking, china_dress, elbow_gloves, lace-trimmed_gloves, looking_at_viewer, official_alternate_costume, pantyhose, solo, taut_dress, black_rose, black_gloves, brown_gloves, cleavage, white_background, blush, parted_lips, simple_background, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_dress | bodystocking | china_dress | elbow_gloves | lace-trimmed_gloves | looking_at_viewer | official_alternate_costume | pantyhose | solo | taut_dress | black_rose | black_gloves | brown_gloves | cleavage | white_background | blush | parted_lips | simple_background | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:---------------|:--------------|:---------------|:----------------------|:--------------------|:-----------------------------|:------------|:-------|:-------------|:-------------|:---------------|:---------------|:-----------|:-------------------|:--------|:--------------|:--------------------|:----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
camilaslz/waguinho | ---
license: openrail
---
|
christykoh/imdb_pt | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negativo
'1': positivo
splits:
- name: train
num_bytes: 33225773
num_examples: 25000
- name: test
num_bytes: 6503491
num_examples: 5000
- name: test_all
num_bytes: 32638767
num_examples: 25000
download_size: 44980841
dataset_size: 72368031
---
# Dataset Card for "imdb_pt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eagle0504/ysa-web-scrape-dataset-qa-formatted-small-version | ---
dataset_info:
features:
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 7792
num_examples: 20
download_size: 11498
dataset_size: 7792
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_stabilityai__stablelm-2-12b | ---
pretty_name: Evaluation run of stabilityai/stablelm-2-12b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [stabilityai/stablelm-2-12b](https://huggingface.co/stabilityai/stablelm-2-12b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__stablelm-2-12b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T23:48:50.499442](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-2-12b/blob/main/results_2024-04-09T23-48-50.499442.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n\
\ \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/stabilityai/stablelm-2-12b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|arc:challenge|25_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|arc:challenge|25_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|gsm8k|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|gsm8k|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hellaswag|10_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hellaswag|10_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T19-45-46.529445.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-48-50.499442.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T23-48-50.499442.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- '**/details_harness|winogrande|5_2024-04-09T19-45-46.529445.parquet'
- split: 2024_04_09T23_48_50.499442
path:
- '**/details_harness|winogrande|5_2024-04-09T23-48-50.499442.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T23-48-50.499442.parquet'
- config_name: results
data_files:
- split: 2024_04_09T19_45_46.529445
path:
- results_2024-04-09T19-45-46.529445.parquet
- split: 2024_04_09T23_48_50.499442
path:
- results_2024-04-09T23-48-50.499442.parquet
- split: latest
path:
- results_2024-04-09T23-48-50.499442.parquet
---
# Dataset Card for Evaluation run of stabilityai/stablelm-2-12b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [stabilityai/stablelm-2-12b](https://huggingface.co/stabilityai/stablelm-2-12b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-2-12b",
"harness_winogrande_5",
	split="latest")
```
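The timestamped split names used above sort chronologically when sorted lexicographically, so the most recent run can be selected without parsing dates. The helper below is an illustrative sketch, not part of the `datasets` API:

```python
def latest_timestamped_split(split_names):
    """Return the most recent timestamped split name.

    Assumes the "YYYY_MM_DDTHH_MM_SS.ffffff" naming convention seen in
    this card, where lexicographic order matches chronological order.
    """
    timestamped = [s for s in split_names if s[:1].isdigit()]
    return max(timestamped)

splits = ["2024_04_09T19_45_46.529445", "2024_04_09T23_48_50.499442", "latest"]
print(latest_timestamped_split(splits))  # 2024_04_09T23_48_50.499442
```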
## Latest results
These are the [latest results from run 2024-04-09T23:48:50.499442](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-2-12b/blob/main/results_2024-04-09T23-48-50.499442.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hicham12/AUDIT | ---
license: openrail
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
KonradSzafer/stackoverflow_python_preprocessed | ---
dataset_info:
features:
- name: title
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 5119086
num_examples: 3296
download_size: 1939470
dataset_size: 5119086
task_categories:
- question-answering
language:
- en
pretty_name: Stack Overflow Python - Preprocessed
size_categories:
- 1K<n<10K
---
# Dataset Card for "stackoverflow_python_preprocessed"
This is a preprocessed version of the `stackoverflow_python` dataset.
Questions and answers were filtered to only include questions with more than 100 votes and answers with more than 5 votes.
The dataset has been converted from HTML to plain text and only includes the title, question, and answer columns.
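As a rough sketch, the vote-threshold filtering described above amounts to something like the following (the field names `question_score` and `answer_score` are assumptions for illustration, not taken from the source dataset's schema):

```python
# Hypothetical illustration of the vote filtering described above.
# The field names `question_score` and `answer_score` are assumptions.
rows = [
    {"question_score": 150, "answer_score": 12},  # kept: both thresholds met
    {"question_score": 150, "answer_score": 3},   # dropped: answer votes too low
    {"question_score": 40,  "answer_score": 9},   # dropped: question votes too low
]
kept = [r for r in rows if r["question_score"] > 100 and r["answer_score"] > 5]
print(len(kept))  # -> 1
```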
## Additional Information
### License
All Stack Overflow user contributions are licensed under CC-BY-SA 3.0 with attribution required.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chirunder/transliteration_classification_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: classification
dtype: string
splits:
- name: train
num_bytes: 201288.8
num_examples: 2400
- name: test
num_bytes: 50322.2
num_examples: 600
download_size: 181466
dataset_size: 251611.0
---
# Dataset Card for "transliteration_classification_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_augtoma__qCammel-70 | ---
pretty_name: Evaluation run of augtoma/qCammel-70
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [augtoma/qCammel-70](https://huggingface.co/augtoma/qCammel-70) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_augtoma__qCammel-70\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T22:35:35.594396](https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-70/blob/main/results_2023-10-17T22-35-35.594396.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.033766778523489936,\n\
\ \"em_stderr\": 0.001849802869119515,\n \"f1\": 0.10340918624161041,\n\
\ \"f1_stderr\": 0.0022106009828094797,\n \"acc\": 0.5700654570173166,\n\
\ \"acc_stderr\": 0.011407494958111332\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.033766778523489936,\n \"em_stderr\": 0.001849802869119515,\n\
\ \"f1\": 0.10340918624161041,\n \"f1_stderr\": 0.0022106009828094797\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2971948445792267,\n \
\ \"acc_stderr\": 0.012588685966624186\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598479\n\
\ }\n}\n```"
repo_url: https://huggingface.co/augtoma/qCammel-70
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T22_35_35.594396
path:
- '**/details_harness|drop|3_2023-10-17T22-35-35.594396.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T22-35-35.594396.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T22_35_35.594396
path:
- '**/details_harness|gsm8k|5_2023-10-17T22-35-35.594396.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T22-35-35.594396.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T22_35_35.594396
path:
- '**/details_harness|winogrande|5_2023-10-17T22-35-35.594396.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T22-35-35.594396.parquet'
- config_name: results
data_files:
- split: 2023_10_17T22_35_35.594396
path:
- results_2023-10-17T22-35-35.594396.parquet
- split: latest
path:
- results_2023-10-17T22-35-35.594396.parquet
---
# Dataset Card for Evaluation run of augtoma/qCammel-70
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/augtoma/qCammel-70
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [augtoma/qCammel-70](https://huggingface.co/augtoma/qCammel-70) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
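For reference, the timestamp-to-split-name mapping appears to be a simple character substitution (inferred from the split names listed in this card's configs, not a documented guarantee of the evaluation harness):

```python
def split_name_from_timestamp(ts: str) -> str:
    # '-' in the date and ':' in the time become '_';
    # the 'T' separator and the sub-second '.' are kept as-is.
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(split_name_from_timestamp("2023-10-17T22:35:35.594396"))
# -> 2023_10_17T22_35_35.594396
```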
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_augtoma__qCammel-70",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T22:35:35.594396](https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-70/blob/main/results_2023-10-17T22-35-35.594396.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.033766778523489936,
"em_stderr": 0.001849802869119515,
"f1": 0.10340918624161041,
"f1_stderr": 0.0022106009828094797,
"acc": 0.5700654570173166,
"acc_stderr": 0.011407494958111332
},
"harness|drop|3": {
"em": 0.033766778523489936,
"em_stderr": 0.001849802869119515,
"f1": 0.10340918624161041,
"f1_stderr": 0.0022106009828094797
},
"harness|gsm8k|5": {
"acc": 0.2971948445792267,
"acc_stderr": 0.012588685966624186
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598479
}
}
```
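Note that in this run the top-level `acc` is simply the unweighted mean of the two per-task accuracies (an observation about these particular numbers, not a documented aggregation rule of the harness):

```python
import math

gsm8k_acc = 0.2971948445792267
winogrande_acc = 0.8429360694554064
overall_acc = (gsm8k_acc + winogrande_acc) / 2
# matches the reported "all" accuracy up to floating-point rounding
assert math.isclose(overall_acc, 0.5700654570173166)
```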
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/mochida_arisa_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mochida_arisa/持田亜里沙/모치다아리사 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of mochida_arisa/持田亜里沙/모치다아리사 (THE iDOLM@STER: Cinderella Girls), containing 85 images and their tags.
The core tags of this character are `brown_hair, brown_eyes, long_hair, breasts, hair_ornament, scrunchie`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 85 | 88.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mochida_arisa_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 85 | 57.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mochida_arisa_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 196 | 119.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mochida_arisa_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 85 | 81.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mochida_arisa_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 196 | 159.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mochida_arisa_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mochida_arisa_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, looking_at_viewer, solo, smile, blush, hand_puppet, open_mouth, cleavage, necklace, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | smile | blush | hand_puppet | open_mouth | cleavage | necklace | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:--------|:--------------|:-------------|:-----------|:-----------|:--------------------|:-------------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
maidalun1020/CrosslingualRetrievalQasEn2Zh | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 2986406
num_examples: 20000
- name: corpus
num_bytes: 63916553
num_examples: 79955
download_size: 40536276
dataset_size: 66902959
---
|
hippocrates/DDI2013_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 40950848
num_examples: 18779
- name: valid
num_bytes: 17341741
num_examples: 7244
- name: test
num_bytes: 12802521
num_examples: 5761
download_size: 12675431
dataset_size: 71095110
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
Thauab/cirogames | ---
license: openrail
---
|
vietgpt-archive/ToxicContent | ---
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 13575089.0
num_examples: 48009
download_size: 7797242
dataset_size: 13575089.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ToxicContent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DiegoRoberto10/diegorobert10 | ---
license: openrail
---
|
pccl-org/formal-logic-simple-order-new-objects-bigger-20 | ---
dataset_info:
features:
- name: greater_than
dtype: string
- name: less_than
dtype: string
- name: correct_example
sequence: string
- name: incorrect_example
sequence: string
- name: distance
dtype: int64
splits:
- name: train
num_bytes: 24225
num_examples: 190
download_size: 5169
dataset_size: 24225
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "formal-logic-simple-order-new-objects-bigger-20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sethapun/cv_svamp_augmented_fold4 | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Numbers
dtype: string
- name: Equation
dtype: string
- name: Answer
dtype: float64
- name: group_nums
dtype: string
- name: Body
dtype: string
- name: Ques
dtype: string
- name: question
dtype: string
- name: body
dtype: string
- name: equation
dtype: string
- name: wrong_equation
dtype: string
- name: WrongAnswer
dtype: float64
- name: label
dtype: float64
splits:
- name: train
num_bytes: 2820364
num_examples: 3954
- name: validation
num_bytes: 134291
num_examples: 184
download_size: 954858
dataset_size: 2954655
---
# Dataset Card for "cv_svamp_augmented_fold4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-3aabac9e-7554869 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: flax-community/t5-base-cnn-dm
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: flax-community/t5-base-cnn-dm
* Dataset: cnn_dailymail
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
someonehahasomeone/yeahhh | ---
license: cc
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_158 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1015950748.0
num_examples: 199519
download_size: 1036440359
dataset_size: 1015950748.0
---
# Dataset Card for "chunk_158"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/foch_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of foch/フォッシュ/福煦 (Azur Lane)
This is the dataset of foch/フォッシュ/福煦 (Azur Lane), containing 76 images and their tags.
The core tags of this character are `breasts, purple_hair, bangs, hair_between_eyes, multicolored_hair, long_hair, large_breasts, ahoge, crossed_bangs, grey_hair, red_eyes, pink_eyes, blue_hair, hair_ornament, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 76 | 113.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/foch_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 76 | 55.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/foch_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 177 | 116.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/foch_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 76 | 95.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/foch_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 177 | 175.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/foch_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/foch_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, solo, white_leotard, blush, cowboy_shot, cross_hair_ornament, epaulettes, long_sleeves, looking_at_viewer, simple_background, standing, thighhighs, white_background, cape, groin, highleg, jacket, open_mouth, smile |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_shorts, cropped_sweater, looking_at_viewer, off-shoulder_sweater, official_alternate_costume, smile, solo, white_sweater, cowboy_shot, midriff, navel, open_mouth, white_background, blush, simple_background, two-tone_hair, bag, cleavage, long_sleeves, petals, standing |
| 2 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_shorts, collarbone, cropped_sweater, high-waist_shorts, looking_at_viewer, off-shoulder_sweater, official_alternate_costume, solo, standing, thigh_holster, white_sweater, blush, cleavage, handbag, sleeves_past_wrists, long_sleeves, parted_lips, shoulder_bag, smile, zipper_pull_tab, white_background, cowboy_shot, full_body, legs, shoes, two-tone_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_leotard | blush | cowboy_shot | cross_hair_ornament | epaulettes | long_sleeves | looking_at_viewer | simple_background | standing | thighhighs | white_background | cape | groin | highleg | jacket | open_mouth | smile | bare_shoulders | black_shorts | cropped_sweater | off-shoulder_sweater | official_alternate_costume | white_sweater | midriff | navel | two-tone_hair | bag | cleavage | petals | collarbone | high-waist_shorts | thigh_holster | handbag | sleeves_past_wrists | parted_lips | shoulder_bag | zipper_pull_tab | full_body | legs | shoes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------------|:--------|:--------------|:----------------------|:-------------|:---------------|:--------------------|:--------------------|:-----------|:-------------|:-------------------|:-------|:--------|:----------|:---------|:-------------|:--------|:-----------------|:---------------|:------------------|:-----------------------|:-----------------------------|:----------------|:----------|:--------|:----------------|:------|:-----------|:---------|:-------------|:--------------------|:----------------|:----------|:----------------------|:--------------|:---------------|:------------------|:------------|:-------|:--------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | X | X | | | X | X | X | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | X | X | | | X | X | | X | | X | | | | | | X | X | X | X | X | X | X | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X |
|
kirim9001/Ogpot | ---
license: other
---
|
liuyanchen1015/MULTI_VALUE_qqp_emphatic_reflex | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 47188
num_examples: 261
- name: test
num_bytes: 529221
num_examples: 2978
- name: train
num_bytes: 462131
num_examples: 2578
download_size: 543127
dataset_size: 1038540
---
# Dataset Card for "MULTI_VALUE_qqp_emphatic_reflex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KeithHorgan98/autotrain-data-TweetClimateAnalysis | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: TweetClimateAnalysis
## Dataset Description
This dataset has been automatically processed by AutoTrain for project TweetClimateAnalysis.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "What do you do if you are a global warming alarmist and real-world temperatures do not warm as much [...]",
"target": 16
},
{
"text": "(2.) A sun-blocking volcanic aerosols component to explain the sudden but temporary cooling of globa[...]",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(num_classes=18, names=['0_0', '1_1', '1_2', '1_3', '1_4', '1_6', '1_7', '2_1', '2_3', '3_1', '3_2', '3_3', '4_1', '4_2', '4_4', '4_5', '5_1', '5_2'], id=None)"
}
```
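Decoding the integer `target` back into its class name is then a simple index lookup into the `ClassLabel` names listed above (a minimal sketch; with the dataset loaded via 🤗 `datasets` you would normally use `features["target"].int2str` instead):

```python
# Class names copied from the ClassLabel definition above.
names = ['0_0', '1_1', '1_2', '1_3', '1_4', '1_6', '1_7', '2_1', '2_3',
         '3_1', '3_2', '3_3', '4_1', '4_2', '4_4', '4_5', '5_1', '5_2']

# The first sample instance above has target 16:
print(names[16])  # -> 5_1
```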
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 23436 |
| valid | 2898 |
|
arham061/my-awesome-dataset | ---
license: apache-2.0
---
|
ben-yu/dreambooth-hackathon-nala | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 87657557.0
num_examples: 20
download_size: 87645130
dataset_size: 87657557.0
---
# Dataset Card for "dreambooth-hackathon-nala"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/toyosatomimi_no_miko_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of toyosatomimi_no_miko/豊聡耳神子/토요사토미미노미코 (Touhou)
This is the dataset of toyosatomimi_no_miko/豊聡耳神子/토요사토미미노미코 (Touhou), containing 500 images and their tags.
The core tags of this character are `short_hair, brown_hair, blonde_hair, brown_eyes, pointy_hair, yellow_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 555.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toyosatomimi_no_miko_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 391.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toyosatomimi_no_miko_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1100 | 743.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toyosatomimi_no_miko_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 522.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toyosatomimi_no_miko_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1100 | 923.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toyosatomimi_no_miko_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/toyosatomimi_no_miko_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bangs, bare_shoulders, bracelet, earmuffs, hair_between_eyes, looking_at_viewer, neck_ribbon, purple_ribbon, sleeveless_shirt, solo, breasts, holding, purple_skirt, ritual_baton, :d, blouse, open_mouth, simple_background, bare_arms, black_belt, blush, cowboy_shot, gradient_background, white_background |
| 1 | 10 |  |  |  |  |  | 1girl, bracelet, earmuffs, ritual_baton, skirt, sleeveless_shirt, smile, solo, sword, belt, looking_at_viewer, open_mouth, cape, sheath |
| 2 | 7 |  |  |  |  |  | 1girl, belt, dress, earmuffs, ritual_baton, sleeveless, solo, sword, bracelet, scabbard, sheathed, skirt |
| 3 | 7 |  |  |  |  |  | 1girl, belt, bracelet, earmuffs, skirt, solo, sword, sheath, sleeveless_shirt |
| 4 | 14 |  |  |  |  |  | 1girl, earmuffs, looking_at_viewer, solo, bangs, bare_shoulders, sleeveless_shirt, bracelet, neck_ribbon, purple_ribbon, upper_body, hair_between_eyes, smile, collarbone, simple_background, closed_mouth, white_background, blush, tattoo, light_brown_hair, sailor_collar, small_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bangs | bare_shoulders | bracelet | earmuffs | hair_between_eyes | looking_at_viewer | neck_ribbon | purple_ribbon | sleeveless_shirt | solo | breasts | holding | purple_skirt | ritual_baton | :d | blouse | open_mouth | simple_background | bare_arms | black_belt | blush | cowboy_shot | gradient_background | white_background | skirt | smile | sword | belt | cape | sheath | dress | sleeveless | scabbard | sheathed | upper_body | collarbone | closed_mouth | tattoo | light_brown_hair | sailor_collar | small_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------|:-----------|:-----------|:--------------------|:--------------------|:--------------|:----------------|:-------------------|:-------|:----------|:----------|:---------------|:---------------|:-----|:---------|:-------------|:--------------------|:------------|:-------------|:--------|:--------------|:----------------------|:-------------------|:--------|:--------|:--------|:-------|:-------|:---------|:--------|:-------------|:-----------|:-----------|:-------------|:-------------|:---------------|:---------|:-------------------|:----------------|:----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | | X | X | | X | | | X | X | | | | X | | | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | X | X | | | | | | X | | | | X | | | | | | | | | | | X | | X | X | | | X | X | X | X | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | X | X | | | | | X | X | | | | | | | | | | | | | | | X | | X | X | | X | | | | | | | | | | | |
| 4 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | X | | | X | | | X | | X | | | | | | | | | X | X | X | X | X | X | X |
|
Falcon96/rambo | ---
license: openrail
---
|
mHossain/final_train_v1_410000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 11588553.0
num_examples: 27000
- name: test
num_bytes: 1287617.0
num_examples: 3000
download_size: 5629024
dataset_size: 12876170.0
---
# Dataset Card for "final_train_v1_410000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_allknowingroger__CalmExperiment-7B-slerp | ---
pretty_name: Evaluation run of allknowingroger/CalmExperiment-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/CalmExperiment-7B-slerp](https://huggingface.co/allknowingroger/CalmExperiment-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__CalmExperiment-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T19:43:22.866639](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__CalmExperiment-7B-slerp/blob/main/results_2024-04-10T19-43-22.866639.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6502101478902175,\n\
\ \"acc_stderr\": 0.032034121868187146,\n \"acc_norm\": 0.6492247412475852,\n\
\ \"acc_norm_stderr\": 0.03270848748700423,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7793179945034926,\n\
\ \"mc2_stderr\": 0.013700865702514428\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274777,\n\
\ \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523203\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.716988647679745,\n\
\ \"acc_stderr\": 0.0044954128683246065,\n \"acc_norm\": 0.8908583947420833,\n\
\ \"acc_norm_stderr\": 0.003111795320787943\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834845,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834845\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.01653682964899711,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.01653682964899711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015057,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015057\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7793179945034926,\n\
\ \"mc2_stderr\": 0.013700865702514428\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.009990706005184136\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \
\ \"acc_stderr\": 0.012607137125693632\n }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/CalmExperiment-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|arc:challenge|25_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|gsm8k|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hellaswag|10_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T19-43-22.866639.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T19-43-22.866639.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- '**/details_harness|winogrande|5_2024-04-10T19-43-22.866639.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T19-43-22.866639.parquet'
- config_name: results
data_files:
- split: 2024_04_10T19_43_22.866639
path:
- results_2024-04-10T19-43-22.866639.parquet
- split: latest
path:
- results_2024-04-10T19-43-22.866639.parquet
---
# Dataset Card for Evaluation run of allknowingroger/CalmExperiment-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/CalmExperiment-7B-slerp](https://huggingface.co/allknowingroger/CalmExperiment-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__CalmExperiment-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
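The per-task config names follow a predictable mapping from the harness task ids used in the parquet file names. A minimal sketch of that mapping, inferred from the config list above (this is not an official helper of the leaderboard tooling):

```python
# Map a harness task id (as it appears in the parquet file names) to the
# config name accepted by load_dataset, e.g.
#   "harness|hendrycksTest-college_physics|5"
#     -> "harness_hendrycksTest_college_physics_5"
# Inferred from the config list above; treat as an illustrative assumption.
def config_name(task_id: str) -> str:
    return task_id.replace("|", "_").replace("-", "_").replace(":", "_")

print(config_name("harness|hendrycksTest-college_physics|5"))
print(config_name("harness|truthfulqa:mc|0"))
```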
## Latest results
These are the [latest results from run 2024-04-10T19:43:22.866639](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__CalmExperiment-7B-slerp/blob/main/results_2024-04-10T19-43-22.866639.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6502101478902175,
"acc_stderr": 0.032034121868187146,
"acc_norm": 0.6492247412475852,
"acc_norm_stderr": 0.03270848748700423,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7793179945034926,
"mc2_stderr": 0.013700865702514428
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274777,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523203
},
"harness|hellaswag|10": {
"acc": 0.716988647679745,
"acc_stderr": 0.0044954128683246065,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.003111795320787943
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834845,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834845
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.01653682964899711,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.01653682964899711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015057,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015057
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7793179945034926,
"mc2_stderr": 0.013700865702514428
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.009990706005184136
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693632
}
}
```
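The per-task entries above make it straightforward to recompute aggregate metrics yourself. A minimal sketch, assuming a results dict shaped like the JSON above (only a few tasks included here for brevity):

```python
# Recompute a macro-average accuracy over the hendrycksTest (MMLU) tasks
# from a results dict shaped like the JSON above. The values below are
# copied from the results shown; the dict is truncated for illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7105263157894737},
}

# Keep only the MMLU subtasks, then average their accuracies equally.
mmlu = {k: v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")}
macro_avg = sum(mmlu.values()) / len(mmlu)
print(f"macro-average acc over {len(mmlu)} tasks: {macro_avg:.4f}")
```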
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Shularp/Process_tested-facebook-flores | ---
dataset_info:
features:
- name: translation
struct:
- name: ar
dtype: string
- name: en
dtype: string
- name: id
sequence: int64
splits:
- name: train
num_bytes: 361758
num_examples: 997
- name: test
num_bytes: 379791
num_examples: 1012
download_size: 412821
dataset_size: 741549
---
# Dataset Card for "Process_tested-facebook-flores"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
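The features block above implies a simple record shape: a `translation` struct with `ar`/`en` strings plus an `id` sequence. A hedged sketch of working with one such record (the record below is hypothetical, not loaded from the Hub):

```python
# Illustrative record matching the schema in the YAML features block above:
# a "translation" struct with "ar"/"en" string fields and an "id" sequence.
example = {
    "translation": {"ar": "مرحبا بالعالم", "en": "Hello, world"},
    "id": [0],
}

# Flatten into a parallel (source, target) sentence pair, as one might
# do when preparing en->ar translation training data.
pair = (example["translation"]["en"], example["translation"]["ar"])
print(pair)
```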
HydraLM/CodeAlpaca-20k_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6953173
num_examples: 20021
download_size: 3442058
dataset_size: 6953173
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "CodeAlpaca-20k_alpaca"
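The schema above declares a two-column alpaca-style layout: a free-form `input` string and a target `output` string. A minimal sketch of one record, with invented placeholder values:

```python
# Illustrative record for the declared features: "input" (the prompt,
# possibly with context) and "output" (the expected completion).
# The values below are made up, not taken from the dataset.
example = {
    "input": "Write a Python function that returns the square of a number.",
    "output": "def square(n):\n    return n * n",
}

# Both fields are plain strings, per the YAML dtypes.
assert set(example) == {"input", "output"}
assert all(isinstance(v, str) for v in example.values())
```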
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
farnhua/MEDAL_CP | ---
dataset_info:
features:
- name: text
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 3532055082
num_examples: 3000000
- name: validation
num_bytes: 1177629097
num_examples: 1000000
- name: test
num_bytes: 1176985149
num_examples: 1000000
download_size: 3309328042
dataset_size: 5886669328
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-samsum-samsum-417ba9-2386774735 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: ARTeLab/it5-summarization-mlsum
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ARTeLab/it5-summarization-mlsum
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Back-up/train-classification-1k | ---
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
- name: update
dtype: int64
splits:
- name: train
num_bytes: 13575089.0
num_examples: 48009
download_size: 7797354
dataset_size: 13575089.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "train-classification-1k"
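Per the schema above, each row pairs a `question` string with an `answer` string and an int64 `update` field. A minimal sketch of the row shape, with invented placeholder values:

```python
# Illustrative row for the declared features: "question"/"answer"
# strings plus an integer "update" flag. Values are placeholders,
# not dataset content.
row = {
    "question": "What category does this ticket belong to?",
    "answer": "billing",
    "update": 0,
}

# Field names and types mirror the YAML feature declarations.
assert set(row) == {"question", "answer", "update"}
assert isinstance(row["update"], int)
```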
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VatsaDev/oh2.5-text | ---
license: mit
---
|
Intuit-GenSRF/hate-speech-offensive | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: string
splits:
- name: train
num_bytes: 2576536
num_examples: 24783
download_size: 1560109
dataset_size: 2576536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hate_speech_offensive"
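The schema above declares a raw `text` string and a `labels` sequence of strings, i.e. each example can carry multiple label strings. A minimal sketch of one record, with invented placeholder values:

```python
# Illustrative record for the declared features: a "text" string and
# a "labels" list of strings. The text and label below are invented
# placeholders, not actual dataset content.
record = {
    "text": "example post flagged for review",
    "labels": ["offensive_language"],
}

# "labels" is a sequence, so multiple strings per example are allowed.
assert isinstance(record["text"], str)
assert all(isinstance(label, str) for label in record["labels"])
```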
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B | ---
pretty_name: Evaluation run of yunconglong/Truthful_DPO_MOE_19B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yunconglong/Truthful_DPO_MOE_19B](https://huggingface.co/yunconglong/Truthful_DPO_MOE_19B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T05:49:46.084708](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B/blob/main/results_2024-01-21T05-49-46.084708.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6651543092121811,\n\
\ \"acc_stderr\": 0.03170223263040272,\n \"acc_norm\": 0.6659497156886632,\n\
\ \"acc_norm_stderr\": 0.03234845655117025,\n \"mc1\": 0.5801713586291309,\n\
\ \"mc1_stderr\": 0.017277030301775766,\n \"mc2\": 0.7229451135419377,\n\
\ \"mc2_stderr\": 0.014949043344645354\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623496,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7132045409281019,\n\
\ \"acc_stderr\": 0.004513409114983828,\n \"acc_norm\": 0.8845847440748855,\n\
\ \"acc_norm_stderr\": 0.003188694028453633\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n\
\ \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n\
\ \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634335,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634335\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\
\ \"acc_stderr\": 0.014214138556913917,\n \"acc_norm\": 0.8033205619412516,\n\
\ \"acc_norm_stderr\": 0.014214138556913917\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n\
\ \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n\
\ \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.0254942593506949,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.0254942593506949\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n\
\ \"acc_stderr\": 0.0127686730761119,\n \"acc_norm\": 0.4921773142112125,\n\
\ \"acc_norm_stderr\": 0.0127686730761119\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041513,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041513\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5801713586291309,\n\
\ \"mc1_stderr\": 0.017277030301775766,\n \"mc2\": 0.7229451135419377,\n\
\ \"mc2_stderr\": 0.014949043344645354\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781096\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.645185746777862,\n \
\ \"acc_stderr\": 0.013179083387979205\n }\n}\n```"
repo_url: https://huggingface.co/yunconglong/Truthful_DPO_MOE_19B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|arc:challenge|25_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|gsm8k|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hellaswag|10_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T05-49-46.084708.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T05-49-46.084708.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- '**/details_harness|winogrande|5_2024-01-21T05-49-46.084708.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T05-49-46.084708.parquet'
- config_name: results
data_files:
- split: 2024_01_21T05_49_46.084708
path:
- results_2024-01-21T05-49-46.084708.parquet
- split: latest
path:
- results_2024-01-21T05-49-46.084708.parquet
---
# Dataset Card for Evaluation run of yunconglong/Truthful_DPO_MOE_19B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yunconglong/Truthful_DPO_MOE_19B](https://huggingface.co/yunconglong/Truthful_DPO_MOE_19B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B",
"harness_winogrande_5",
	split="latest")
```
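The timestamped split names above follow directly from the run timestamp: the hyphens and colons of the ISO timestamp are replaced with underscores so the name is a valid split identifier. A minimal sketch (an illustrative helper, not part of the `datasets` library):

```python
# Convert a run timestamp into the split name used in this dataset.
# "2024-01-21T05:49:46.084708" -> "2024_01_21T05_49_46.084708"
def timestamp_to_split(ts: str) -> str:
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2024-01-21T05:49:46.084708"))
# 2024_01_21T05_49_46.084708
```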
## Latest results
These are the [latest results from run 2024-01-21T05:49:46.084708](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B/blob/main/results_2024-01-21T05-49-46.084708.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6651543092121811,
"acc_stderr": 0.03170223263040272,
"acc_norm": 0.6659497156886632,
"acc_norm_stderr": 0.03234845655117025,
"mc1": 0.5801713586291309,
"mc1_stderr": 0.017277030301775766,
"mc2": 0.7229451135419377,
"mc2_stderr": 0.014949043344645354
},
"harness|arc:challenge|25": {
"acc": 0.6868600682593856,
"acc_stderr": 0.013552671543623496,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.7132045409281019,
"acc_stderr": 0.004513409114983828,
"acc_norm": 0.8845847440748855,
"acc_norm_stderr": 0.003188694028453633
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634335,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634335
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776678,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776678
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913917,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913917
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.394413407821229,
"acc_stderr": 0.01634538676210397,
"acc_norm": 0.394413407821229,
"acc_norm_stderr": 0.01634538676210397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.0254942593506949,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.0254942593506949
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4921773142112125,
"acc_stderr": 0.0127686730761119,
"acc_norm": 0.4921773142112125,
"acc_norm_stderr": 0.0127686730761119
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.026556519470041513,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.026556519470041513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5801713586291309,
"mc1_stderr": 0.017277030301775766,
"mc2": 0.7229451135419377,
"mc2_stderr": 0.014949043344645354
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781096
},
"harness|gsm8k|5": {
"acc": 0.645185746777862,
"acc_stderr": 0.013179083387979205
}
}
```
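Once loaded, the results dictionary can be filtered by task prefix, e.g. to rank the MMLU (`hendrycksTest`) subtasks by normalized accuracy. A small sketch, with a few of the values above hard-coded for illustration:

```python
# Hypothetical subset of the results shown above, hard-coded for the example.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.4},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc_norm": 0.89},
    "harness|winogrande|5": {"acc": 0.8334648776637726},
}

# Keep only the MMLU (hendrycksTest) tasks and strip the harness prefix
# and few-shot suffix from each task name.
mmlu = {
    name.split("-", 1)[1].rsplit("|", 1)[0]: scores["acc_norm"]
    for name, scores in results.items()
    if "hendrycksTest" in name
}
best = max(mmlu, key=mmlu.get)
print(best)  # us_foreign_policy
```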
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CognitiveScience/coscidata | ---
license: mit
task_categories:
- image-research
language:
- en
tags:
- Grayscale Images
- ASCII Labels
pretty_name: coscidata
size_categories:
- 100K<n<1M
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
# AlphaNum Dataset

## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
lordsymbol/lordsymbol | ---
license: openrail
---
|
hackathon-pln-es/unam_tesis | ---
annotations_creators:
- MajorIsaiah
- Ximyer
- clavel
- inoid
language_creators: [crowdsourced]
language: [es]
license: [apache-2.0]
multilinguality: [monolingual]
pretty_name: UNAM Tesis
size_categories:
- n=200
source_datasets: [original]
task_categories: [text-classification]
task_ids: [language-modeling]
---
# Dataset Card for "unam_tesis"
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
- [yiselclavel@gmail.com](mailto:yiselclavel@gmail.com)
- [isaac7isaias@gmail.com](mailto:isaac7isaias@gmail.com)
### Dataset Summary
The unam_tesis dataset contains 1,000 theses from 5 degree programs at the Universidad Nacional Autónoma de México (UNAM), 200 per program. The goal is to keep expanding this dataset with the remaining degree programs and more theses.
### Supported Tasks and Leaderboards
text-classification
### Languages
Spanish (es)
## Dataset Structure
### Data Instances
Dataset instances have the following form:
El objetivo de esta tesis es elaborar un estudio de las condiciones asociadas al aprendizaje desde casa a nivel preescolar y primaria en el municipio de Nicolás Romero a partir de la cancelación de clases presenciales ante la contingencia sanitaria del Covid-19 y el entorno familiar del estudiante. En México, la Encuesta para la Medición del Impacto COVID-19 en la Educación (ECOVID-ED) 2020, es un proyecto que propone el INEGI y realiza de manera especial para conocer las necesidades de la población estudiantil de 3 a 29 años de edad, saber qué está sucediendo con su entorno inmediato, las condiciones en las que desarrollan sus actividades académicas y el apoyo que realizan padres, tutores o cuidadores principales de las personas en edad formativa. La ECOVID-ED 2020 se llevó a cabo de manera especial con el objetivo de conocer el impacto de la cancelación provisional de clases presenciales en las instituciones educativas del país para evitar los contagios por la pandemia COVID-19 en la experiencia educativa de niños, niñas, adolescentes y jóvenes de 3 a 29 años, tanto en el ciclo escolar 2019-2020, como en ciclo 2020-2021. En este ámbito de investigación, el Instituto de Investigaciones sobre la Universidad y la Educación (IISUE) de la Universidad Nacional Autónoma de México público en 2020 la obra “Educación y Pandemia: Una visión académica” que se integran 34 trabajos que abordan la muy amplia temática de la educación y la universidad con reflexiones y ejercicios analíticos estrechamente relacionadas en el marco coyuntural de la pandemia COVID-19. 
La tesis se presenta en tres capítulos: En el capítulo uno se realizará una descripción del aprendizaje de los estudiantes a nivel preescolar y primaria del municipio de NicolásRomero, Estado de México, que por motivo de la contingencia sanitaria contra el Covid-19 tuvieron que concluir su ciclo académico 2019-2020 y el actual ciclo 2020-2021 en su casa debido a la cancelación provisional de clases presenciales y bajo la tutoría de padres, familiar o ser cercano; así como las horas destinadas al estudio y las herramientas tecnológicas como teléfonos inteligentes, computadoras portátiles, computadoras de escritorio, televisión digital y tableta. En el capítulo dos, se presentarán las herramientas necesarias para la captación de la información mediante técnicas de investigación social, a través de las cuales se mencionará, la descripción, contexto y propuestas del mismo, considerando los diferentes tipos de cuestionarios, sus componentes y diseño, teniendo así de manera específica la diversidad de ellos, que llevarán como finalidad realizar el cuestionario en línea para la presente investigación. Posteriormente, se podrá destacar las fases del diseño de la investigación, que se realizarán mediante una prueba piloto tomando como muestra a distintos expertos en el tema. De esta manera se obtendrá la información relevante para estudiarla a profundidad. En el capítulo tres, se realizará el análisis apoyado de las herramientas estadísticas, las cuales ofrecen explorar la muestra de una manera relevante, se aplicará el método inferencial para expresar la información y predecir las condiciones asociadas al autoaprendizaje, la habilidad pedagógica de padres o tutores, la convivencia familiar, la carga académica y actividades escolares y condicionamiento tecnológico,con la finalidad de inferir en la población. Asimismo, se realizarán pruebas de hipótesis, tablas de contingencia y matriz de correlación. 
Por consiguiente, los resultados obtenidos de las estadísticas se interpretarán para describir las condiciones asociadas y como impactan en la enseñanza de preescolar y primaria desde casa.|María de los Ángeles|Blancas Regalado|Análisis de las condiciones del aprendizaje desde casa en los alumnos de preescolar y primaria del municipio de Nicolás Romero |2022|Actuaría
| Degree program | Number of instances |
|--------------|----------------------|
| Actuaría | 200 |
| Derecho| 200 |
| Economía| 200 |
| Psicología| 200 |
| Química Farmacéutico Biológica| 200 |
### Data Fields
The dataset consists of the following fields: "texto|titulo|carrera". <br/>
texto: the text of the thesis introduction. <br/>
titulo: the title of the thesis. <br/>
carrera: the name of the degree program the thesis belongs to. <br/>
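Since the fields are pipe-separated ("texto|titulo|carrera"), a record can be parsed with Python's standard csv module. The two rows below are invented miniature examples, not real dataset entries:

```python
import csv
import io

# Miniature pipe-separated sample in the "texto|titulo|carrera" layout
# described above (the rows here are invented for illustration).
sample = (
    "texto|titulo|carrera\n"
    "El objetivo de esta tesis es...|Análisis de las condiciones|Actuaría\n"
    "La presente tesis estudia...|Modelos económicos|Economía\n"
)

# DictReader maps each row to the column names from the header line
reader = csv.DictReader(io.StringIO(sample), delimiter="|")
rows = list(reader)
print(rows[0]["carrera"])  # → Actuaría
```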
### Data Splits
The dataset has 2 splits: training (train) and test (test).
| Split | Number of instances |
|--------------|-------------------|
| Training | 800 |
| Test | 200 |
## Dataset Creation
### Curation Rationale
The creation of this dataset was motivated by participation in the 2022 Spanish NLP Hackathon organized by Somos NLP, whose goal is to democratize NLP in Spanish and promote its application to good causes, and by the fact that no thesis dataset in Spanish existed.
### Source Data
#### Initial Data Collection and Normalization
The original dataset (dataset_tesis) was created through a scraping process that extracted theses from the Universidad Nacional Autónoma de México at the following link: https://tesiunam.dgb.unam.mx/F?func=find-b-0&local_base=TES01.
A scraper was built to gather the information, using the TESIUNAM database: a catalog of the theses of candidates who obtained a degree at UNAM, as well as undergraduate theses from schools affiliated with it.
First, the University's course catalog (http://oferta.unam.mx/indice-alfabetico.html) was consulted, and each of its 131 undergraduate programs was extracted as a list. Each case present in the database was then analyzed, since some programs have more than 10 theses, others fewer than 10, and some only one thesis or none available. Selenium was used to drive a web browser (Edge), and the scraper is currently configured to fetch the first 20 theses, or fewer, per program.
From this database the scraper obtains:
- Author's first name(s)
- Author's last name(s)
- Thesis title
- Thesis year
- Thesis degree program
The scraper also downloads each thesis into the local machine's Downloads folder. The thesis abstract, introduction, or conclusion (whichever was available first) was added to the CSV produced by the scraper, since the main difficulty lies in the differing structure and format of each thesis.
#### Who are the source language producers?
The data are created manually by humans, in this case UNAM students, and reviewed by their supervisors.
### Annotations
The dataset was processed to remove information the classifiers do not need. The original dataset has the following fields: "texto|autor_nombre|autor_apellido|titulo|año|carrera".
#### Annotation process
First, 200 theses were extracted for each of 5 degree programs at this university: Actuaría, Derecho, Economía, Psicología, and Química Farmacéutico Biológica. From each thesis the following were extracted: the introduction, the author's first and last names, the thesis title, and the degree program. The data were reviewed and cleaned by the authors.
The dataset was then processed with the following Natural Language Processing tasks (dataset_tesis_procesado):
- lowercasing
- tokenization
- removal of non-alphanumeric words
- stopword removal
- stemming: removal of plurals
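As a rough sketch of that pipeline, a plain-Python version of the five steps could look like this (the stopword list and the plural-stripping rule are illustrative assumptions, not the authors' exact tools):

```python
import re

# Minimal sketch of the preprocessing pipeline described above. The stopword
# list and stemmer actually used by the authors are not specified in the card;
# the ones below are illustrative only.
STOPWORDS = {"de", "del", "la", "las", "el", "los", "y", "en", "un", "una", "que"}

def preprocess(text):
    text = text.lower()                                   # convert to lowercase
    tokens = re.findall(r"\w+", text)                     # tokenize
    tokens = [t for t in tokens if t.isalnum()]           # keep alphanumeric words
    tokens = [t for t in tokens if t not in STOPWORDS]    # remove stopwords
    return [t[:-1] if t.endswith("s") else t for t in tokens]  # naive plural stripping

print(preprocess("Las condiciones del aprendizaje en los alumnos"))
# → ['condicione', 'aprendizaje', 'alumno']
```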
#### Who are the annotators?
The annotations were made by humans, in this case the dataset authors, using program code written in Python.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
This dataset will aid search and research on theses in Spanish by enabling their automatic categorization with a model trained on it. This task supports UN Sustainable Development Goal 4: Quality Education (https://www.un.org/sustainabledevelopment/es/objetivos-de-desarrollo-sostenible/).
### Discussion of Biases
The text contains some encoding errors, so some characters such as accented letters are not displayed correctly. Words containing these characters are removed during preprocessing until the problem is fixed.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Team members (Hugging Face username):
[Isacc Isahias López López](https://huggingface.co/MajorIsaiah)
[Yisel Clavel Quintero](https://huggingface.co/clavel)
[Dionis López](https://huggingface.co/inoid)
[Ximena Yeraldin López López](https://huggingface.co/Ximyer)
### Licensing Information
Version 1.0.0 of the unam_tesis dataset is released under the <a href='http://www.apache.org/licenses/LICENSE-2.0'>Apache-2.0 License</a>.
### Citation Information
"Esta base de datos se ha creado en el marco del Hackathon 2022 de PLN en Español organizado por Somos NLP patrocinado por Platzi, Paperspace y Hugging Face: https://huggingface.co/hackathon-pln-es."
Para citar este dataset, por favor, use el siguiente formato de cita:
@inproceedings{Hackathon 2022 de PLN en Español,
title={UNAM's Theses with BETO fine-tuning classify},
author={López López, Isaac Isaías; Clavel Quintero, Yisel; López Ramos, Dionis & López López, Ximena Yeraldin},
booktitle={Hackathon 2022 de PLN en Español},
year={2022}
}
### Contributions
Thanks to [@yiselclavel](https://github.com/yiselclavel) and [@IsaacIsaias](https://github.com/IsaacIsaias) for adding this dataset.
|
CATIE-AQ/amazon_massive_intent_fr_prompt_intent_classification | ---
language:
- fr
license: apache-2.0
size_categories:
- 100K<n<1M
task_categories:
- text-classification
tags:
- intent-classification
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- amazon_massive_intent
---
# amazon_massive_intent_fr_prompt_intent_classification
## Summary
**amazon_massive_intent_fr_prompt_intent_classification** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **555,000** rows that can be used for an intent text classification task.
The original data (without prompts) comes from the dataset [amazon_massive_intent_fr-FR](https://huggingface.co/datasets/SetFit/amazon_massive_intent_fr-FR) by FitzGerald et al.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
30 prompts were created for this dataset. They are phrased in the indicative mood and use both the informal (tutoiement) and the formal (vouvoiement) French forms of address.
```python
text+'\n Étant donné la liste de catégories suivante : "'+classes+'" à quelle catégorie appartient le texte ?',
text+'\n Étant donné la liste de classes suivante : "'+classes+'" à quelle classe appartient le texte ?',
'Étant donné une liste de catégories : "'+classes+'" à quelle catégorie appartient le texte suivant ?\n Texte : '+text,
'Étant donné une liste de classes : "'+classes+'" à quelle classe appartient le texte suivant ?\n Texte : '+text,
'Étant donné un choix de catégories : "'+classes+'", le texte fait référence à laquelle ?\n Texte : '+text,
'Étant donné un choix de classe : "'+classes+'", le texte fait référence à laquelle ?\n Texte : '+text,
'Choisir une catégorie pour le texte suivant. Les options sont les suivantes : "'+classes+'"\n Texte : '+text,
'Choisir une catégorie pour le texte suivant. Les possibilités sont les suivantes : "'+classes+'"\n Texte : '+text,
'Choisir une catégorie pour le texte suivant. Les choix sont les suivants : "'+classes+'"\n Texte : '+text,
'Choisir une classe pour le texte suivant. Les options sont les suivantes : "'+classes+'"\n Texte : '+text,
'Choisir une classe pour le texte suivant. Les possibilités sont les suivantes : "'+classes+'"\n Texte : '+text,
'Choisir une classe pour le texte suivant. Les choix sont les suivants : "'+classes+'"\n Texte : '+text,
'Sélectionner une catégorie pour le texte suivant. Les options sont les suivantes : "'+classes+'"\n Texte : '+text,
'Sélectionner une catégorie pour le texte suivant. Les possibilités sont les suivantes : "'+classes+'"\n Texte : '+text,
'Sélectionner une catégorie pour le texte suivant. Les choix sont les suivants : "'+classes+'"\n Texte : '+text,
'Sélectionner une classe pour le texte suivant. Les options sont les suivantes : "'+classes+'"\n Texte : '+text,
'Sélectionner une classe pour le texte suivant. Les possibilités sont les suivantes : "'+classes+'"\n Texte : '+text,
'Sélectionner une classe pour le texte suivant. Les choix sont les suivants : "'+classes+'"\n Texte : '+text,
'Parmi la liste de catégories suivantes : "'+classes+'",\n indiquer celle présente dans le texte : '+text,
'Parmi la liste de classes suivantes : "'+classes+'",\n indiquer celle présente dans le texte : '+text,
"""Parmi la liste d'intentions suivantes : " """+classes+""" ",\n indiquer celle présente dans le texte : """+text,
text+"""\n Étant donné la liste d'intentions suivante : " """+classes+""" ", à quelle intention appartient le texte ?""",
"""Étant donné une liste d'intentions : " """+classes+""" ", à quelle intention appartient le texte suivant ?\n Texte : """+text,
"""Étant donné un choix d'intentions : " """+classes+""" ", le texte fait référence à laquelle ?""",
'Choisir une intention pour le texte suivant. Les options sont les suivantes : "'+classes+'"\n Texte : '+text,
'Choisir une intention pour le texte suivant. Les possibilités sont les suivantes : "'+classes+'"\n Texte : '+text,
'Choisir une intention pour le texte suivant. Les choix sont les suivants : "'+classes+'"\n Texte : '+text,
'Sélectionner une intention pour le texte suivant. Les options sont les suivantes : "'+classes+'"\n Texte : '+text,
'Sélectionner une intention pour le texte suivant. Les possibilités sont les suivantes : "'+classes+'"\n Texte : '+text,
'Sélectionner une intention pour le texte suivant. Les choix sont les suivants : "'+classes+'"\n Texte : '+text
```
### Features used in the prompts
In the prompt list above, `classes`, `text` and `targets` have been constructed from:
```python
massive = load_dataset('SetFit/amazon_massive_intent_fr-FR')
classes = 'audio_volume_other, play_music, iot_hue_lighton, general_greet, calendar_set, audio_volume_down, social_query, audio_volume_mute, iot_wemo_on, iot_hue_lightup, audio_volume_up, iot_coffee, takeaway_query, qa_maths, play_game, cooking_query, iot_hue_lightdim, iot_wemo_off, music_settings, weather_query, news_query, alarm_remove, social_post, recommendation_events, transport_taxi, takeaway_order, music_query, calendar_query, lists_query, qa_currency, recommendation_movies, general_joke, recommendation_locations, email_querycontact, lists_remove, play_audiobook, email_addcontact, lists_createoradd, play_radio, qa_stock, alarm_query, email_sendemail, general_quirky, music_likeness, cooking_recipe, email_query, datetime_query, transport_traffic, play_podcasts, iot_hue_lightchange, calendar_remove, transport_query, transport_ticket, qa_factoid, iot_cleaning, alarm_set, datetime_convert, iot_hue_lightoff, qa_definition, music_dislikeness'
text = massive['train']['text'][i]
targets = massive['train']['label_text'][i]
```
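To make the construction concrete, here is a minimal sketch of how one (inputs, targets) row is assembled from the first template above (the sample utterance and label are invented for illustration; in the real dataset they come from the "text" and "label_text" columns, and `classes` is the full intent list shown above, truncated here for brevity):

```python
# Sketch of how one (inputs, targets) row is assembled from a prompt template.
classes = "audio_volume_other, play_music, iot_hue_lighton"  # truncated intent list
text = "mets de la musique"   # invented sample utterance
targets = "play_music"        # invented sample label

# First template from the list above
inputs = text + '\n Étant donné la liste de catégories suivante : "' + classes + '" à quelle catégorie appartient le texte ?'

row = {"inputs": inputs, "targets": targets}
print(row["inputs"])
```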
## Splits
- `train` with 345,000 samples
- `valid` with 105,000 samples
- `test` with 105,000 samples
## How to use?
```python
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/amazon_massive_intent_fr_prompt_intent_classification")
```
## Citation
### Original data
> @misc{fitzgerald2022massive,
>       title={MASSIVE: A 1M-Example Multilingual Natural Language Understanding Dataset with 51 Typologically-Diverse Languages},
>       author={Jack FitzGerald and Christopher Hench and Charith Peris and Scott Mackie and Kay Rottmann and Ana Sanchez and Aaron Nash and Liam Urbach and Vishesh Kakarala and Richa Singh and Swetha Ranganath and Laurie Crist and Misha Britan and Wouter Leeuwis and Gokhan Tur and Prem Natarajan},
>       year={2022},
>       eprint={2204.08582},
>       archivePrefix={arXiv},
>       primaryClass={cs.CL}
> }
### This Dataset
> @misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
>       author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
>       title = { DFP (Revision 1d24c09) },
>       year = 2023,
>       url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
>       doi = { 10.57967/hf/1200 },
>       publisher = { Hugging Face }
> }
## License
Apache 2.0 |
open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v4 | ---
pretty_name: Evaluation run of CobraMamba/mamba-gpt-3b-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CobraMamba/mamba-gpt-3b-v4](https://huggingface.co/CobraMamba/mamba-gpt-3b-v4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T00:01:02.690756](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v4/blob/main/results_2023-10-25T00-01-02.690756.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.011954697986577181,\n\
\ \"em_stderr\": 0.0011130056898859247,\n \"f1\": 0.0627841862416108,\n\
\ \"f1_stderr\": 0.0016440985205687317,\n \"acc\": 0.3325355902710252,\n\
\ \"acc_stderr\": 0.007798820060438671\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.011954697986577181,\n \"em_stderr\": 0.0011130056898859247,\n\
\ \"f1\": 0.0627841862416108,\n \"f1_stderr\": 0.0016440985205687317\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \
\ \"acc_stderr\": 0.00226753710225448\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6582478295185478,\n \"acc_stderr\": 0.013330103018622863\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CobraMamba/mamba-gpt-3b-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|arc:challenge|25_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T00_01_02.690756
path:
- '**/details_harness|drop|3_2023-10-25T00-01-02.690756.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T00-01-02.690756.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T00_01_02.690756
path:
- '**/details_harness|gsm8k|5_2023-10-25T00-01-02.690756.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T00-01-02.690756.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hellaswag|10_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-17-28.228620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T14-17-28.228620.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T14-17-28.228620.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T00_01_02.690756
path:
- '**/details_harness|winogrande|5_2023-10-25T00-01-02.690756.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T00-01-02.690756.parquet'
- config_name: results
data_files:
- split: 2023_09_11T14_17_28.228620
path:
- results_2023-09-11T14-17-28.228620.parquet
- split: 2023_10_25T00_01_02.690756
path:
- results_2023-10-25T00-01-02.690756.parquet
- split: latest
path:
- results_2023-10-25T00-01-02.690756.parquet
---
# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CobraMamba/mamba-gpt-3b-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CobraMamba/mamba-gpt-3b-v4](https://huggingface.co/CobraMamba/mamba-gpt-3b-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-25T00:01:02.690756](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v4/blob/main/results_2023-10-25T00-01-02.690756.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.011954697986577181,
"em_stderr": 0.0011130056898859247,
"f1": 0.0627841862416108,
"f1_stderr": 0.0016440985205687317,
"acc": 0.3325355902710252,
"acc_stderr": 0.007798820060438671
},
"harness|drop|3": {
"em": 0.011954697986577181,
"em_stderr": 0.0011130056898859247,
"f1": 0.0627841862416108,
"f1_stderr": 0.0016440985205687317
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.00226753710225448
},
"harness|winogrande|5": {
"acc": 0.6582478295185478,
"acc_stderr": 0.013330103018622863
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v7](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-04T02:38:01.038212](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7/blob/main/results_2023-09-04T02%3A38%3A01.038212.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6832397060915553,\n\
\ \"acc_stderr\": 0.031693477754770626,\n \"acc_norm\": 0.6869592578044069,\n\
\ \"acc_norm_stderr\": 0.03166529474407705,\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986214,\n \"mc2\": 0.6310264033909807,\n\
\ \"mc2_stderr\": 0.01502146266727205\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.01368814730972912,\n\
\ \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6818362875921131,\n\
\ \"acc_stderr\": 0.004648115322328777,\n \"acc_norm\": 0.873132842063334,\n\
\ \"acc_norm_stderr\": 0.0033214390244115494\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882923,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882923\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.030976692998534422,\n\
\ \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.030976692998534422\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838987,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838987\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n\
\ \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n\
\ \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853113,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853113\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6948717948717948,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.6948717948717948,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277726,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277726\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683775,\n \"\
acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080438,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878456,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878456\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n\
\ \"acc_stderr\": 0.02856807946471428,\n \"acc_norm\": 0.7623318385650224,\n\
\ \"acc_norm_stderr\": 0.02856807946471428\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n\
\ \"acc_stderr\": 0.012234384586856491,\n \"acc_norm\": 0.8646232439335888,\n\
\ \"acc_norm_stderr\": 0.012234384586856491\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5910614525139665,\n\
\ \"acc_stderr\": 0.016442830654715548,\n \"acc_norm\": 0.5910614525139665,\n\
\ \"acc_norm_stderr\": 0.016442830654715548\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.023839303311398195,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.023839303311398195\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8024691358024691,\n \"acc_stderr\": 0.022152889927898968,\n\
\ \"acc_norm\": 0.8024691358024691,\n \"acc_norm_stderr\": 0.022152889927898968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5430247718383312,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.5430247718383312,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7238562091503268,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.7238562091503268,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986214,\n \"mc2\": 0.6310264033909807,\n\
\ \"mc2_stderr\": 0.01502146266727205\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|arc:challenge|25_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hellaswag|10_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T02:38:01.038212.parquet'
- config_name: results
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- results_2023-09-04T02:38:01.038212.parquet
- split: latest
path:
- results_2023-09-04T02:38:01.038212.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v7
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v7](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7",
"harness_truthfulqa_mc_0",
split="latest")
```
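As an aside, the per-run split names listed in the YAML configs above can be derived from the run timestamp. The sketch below encodes that convention as inferred from this card's own config listing (it is not an official API guarantee):

```python
# Derive the per-run split name from a run timestamp by replacing
# '-' and ':' with '_' — a naming convention inferred from the
# config listing in this card, not a documented API.
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-04T02:38:01.038212"))
# 2023_09_04T02_38_01.038212
```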
## Latest results
These are the [latest results from run 2023-09-04T02:38:01.038212](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7/blob/main/results_2023-09-04T02%3A38%3A01.038212.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6832397060915553,
"acc_stderr": 0.031693477754770626,
"acc_norm": 0.6869592578044069,
"acc_norm_stderr": 0.03166529474407705,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986214,
"mc2": 0.6310264033909807,
"mc2_stderr": 0.01502146266727205
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.01368814730972912,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725227
},
"harness|hellaswag|10": {
"acc": 0.6818362875921131,
"acc_stderr": 0.004648115322328777,
"acc_norm": 0.873132842063334,
"acc_norm_stderr": 0.0033214390244115494
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882923,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882923
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.030976692998534422,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.030976692998534422
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838987,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838987
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853113,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853113
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6948717948717948,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.6948717948717948,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080438,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878456,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878456
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.02856807946471428,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.02856807946471428
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856491,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856491
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5910614525139665,
"acc_stderr": 0.016442830654715548,
"acc_norm": 0.5910614525139665,
"acc_norm_stderr": 0.016442830654715548
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.023839303311398195,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.023839303311398195
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8024691358024691,
"acc_stderr": 0.022152889927898968,
"acc_norm": 0.8024691358024691,
"acc_norm_stderr": 0.022152889927898968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5430247718383312,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.5430247718383312,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7238562091503268,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.7238562091503268,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.043502714429232425,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.043502714429232425
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986214,
"mc2": 0.6310264033909807,
"mc2_stderr": 0.01502146266727205
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
distilled-from-one-sec-cv12/chunk_269 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1122609604
num_examples: 218747
download_size: 1144088564
dataset_size: 1122609604
---
# Dataset Card for "chunk_269"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
michaelmallari/sportsbook-nhl | ---
license: mit
---
|
Farisya/ft-usermanualv2 | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 43663
num_examples: 51
- name: test
num_bytes: 7388
num_examples: 9
download_size: 21135
dataset_size: 51051
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
hjawad367/segformer-b0-finetuned-segments-sidewalk-oct-22 | ---
license: mit
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 3138225.0
num_examples: 10
download_size: 3139734
dataset_size: 3138225.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/iizunamaru_megumu_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of iizunamaru_megumu/飯綱丸龍/이이즈나마루메구무 (Touhou)
This is the dataset of iizunamaru_megumu/飯綱丸龍/이이즈나마루메구무 (Touhou), containing 45 images and their tags.
The core tags of this character are `hat, long_hair, tokin_hat, red_eyes, blue_hair, blue_headwear, breasts, pointy_ears, hair_between_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 45 | 70.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iizunamaru_megumu_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 45 | 38.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iizunamaru_megumu_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 117 | 80.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iizunamaru_megumu_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 45 | 61.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iizunamaru_megumu_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 117 | 116.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iizunamaru_megumu_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/iizunamaru_megumu_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
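For the `IMG+TXT` packages, each image is accompanied by a same-named `.txt` file holding its tags. A minimal sketch of pairing them after extraction — the flat directory layout and the set of image extensions are assumptions based on the package descriptions, not guarantees:

```python
import os

def iter_img_txt_pairs(dataset_dir):
    """Yield (image_path, tags) pairs from an extracted IMG+TXT package.

    Assumes images and their tag .txt files sit side by side and share
    a filename stem (e.g. 001.png / 001.txt).
    """
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue  # skip non-image files such as the .txt tag files
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        tags = ""
        if os.path.exists(txt_path):
            with open(txt_path, encoding="utf-8") as f:
                tags = f.read().strip()
        yield os.path.join(dataset_dir, name), tags
```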
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, blue_dress, frilled_dress, pom_pom_(clothes), ribbon_trim, sleeveless_coat, solo, kneehighs, tengu-geta, black_socks, smile, gem, purple_footwear, open_mouth, looking_at_viewer, black_coat |
| 1 | 6 |  |  |  |  |  | 1girl, blue_dress, frilled_dress, pom_pom_(clothes), ribbon_trim, sleeveless_coat, solo, gem, kneehighs, looking_at_viewer, starry_sky, black_socks, closed_mouth, night_sky, smile, cloud, pauldrons, tengu-geta |
| 2 | 12 |  |  |  |  |  | 1girl, blue_dress, frilled_dress, ribbon_trim, solo, gem, sleeveless_coat, pom_pom_(clothes), large_breasts, simple_background, looking_at_viewer, smile, white_background, wings, black_coat, closed_mouth, cowboy_shot, earrings |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_dress | frilled_dress | pom_pom_(clothes) | ribbon_trim | sleeveless_coat | solo | kneehighs | tengu-geta | black_socks | smile | gem | purple_footwear | open_mouth | looking_at_viewer | black_coat | starry_sky | closed_mouth | night_sky | cloud | pauldrons | large_breasts | simple_background | white_background | wings | cowboy_shot | earrings |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:----------------|:--------------------|:--------------|:------------------|:-------|:------------|:-------------|:--------------|:--------|:------|:------------------|:-------------|:--------------------|:-------------|:-------------|:---------------|:------------|:--------|:------------|:----------------|:--------------------|:-------------------|:--------|:--------------|:-----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | X | | X | X | X | X | X | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | | | | X | X | | | X | X | | X | | | | X | X | X | X | X | X |
|
Dampish/Proccessed-GPT-NEO | ---
license: cc-by-nc-4.0
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 16099408241
num_examples: 3943800
download_size: 4189123262
dataset_size: 16099408241
---
|
distilled-from-one-sec-cv12/chunk_22 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1081620320
num_examples: 210760
download_size: 1105089406
dataset_size: 1081620320
---
# Dataset Card for "chunk_22"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_sent_after_sent_train_400_eval_40_random_permute_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 3715966.658418829
num_examples: 2874
- name: validation
num_bytes: 232483
num_examples: 200
download_size: 1053957
dataset_size: 3948449.658418829
---
# Dataset Card for "find_sent_after_sent_train_400_eval_40_random_permute_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mila-intel/ProtST-SubcellularLocalization | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: prot_seq
dtype: string
- name: localization
dtype: int64
splits:
- name: train
num_bytes: 8077128
num_examples: 8420
- name: validation
num_bytes: 2678401
num_examples: 2811
- name: test
num_bytes: 2742147
num_examples: 2773
download_size: 8912300
dataset_size: 13497676
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
HuggingFaceM4/VQAv2_modif | Invalid username or password. |
ricardo-filho/test_americanas | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: sentence
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 83378
num_examples: 961
download_size: 50417
dataset_size: 83378
---
# Dataset Card for "test_americanas"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/eunhwa_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of eunhwa/ウンファ/银华/은화 (Nikke: Goddess of Victory)
This is the dataset of eunhwa/ウンファ/银华/은화 (Nikke: Goddess of Victory), containing 43 images and their tags.
The core tags of this character are `black_hair, bangs, long_hair, purple_eyes, breasts, hat, multicolored_hair, beret`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 52.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eunhwa_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 43 | 31.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eunhwa_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 102 | 64.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eunhwa_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 43 | 48.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eunhwa_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 102 | 89.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eunhwa_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/eunhwa_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_gloves, black_headwear, fingerless_gloves, black_shirt, black_thighhighs, long_sleeves, medium_breasts, purple_hair, sailor_collar, black_jacket, black_panties, closed_mouth, cowboy_shot, crop_top, holding_weapon, neckerchief, rifle, thighs, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_gloves | black_headwear | fingerless_gloves | black_shirt | black_thighhighs | long_sleeves | medium_breasts | purple_hair | sailor_collar | black_jacket | black_panties | closed_mouth | cowboy_shot | crop_top | holding_weapon | neckerchief | rifle | thighs | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:-----------------|:--------------------|:--------------|:-------------------|:---------------|:-----------------|:--------------|:----------------|:---------------|:----------------|:---------------|:--------------|:-----------|:-----------------|:--------------|:--------|:---------|:-------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
ericyu/CLCD_Cropped_256 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
dataset_info:
features:
- name: imageA
dtype: image
- name: imageB
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 29228609.52
num_examples: 1440
- name: test
num_bytes: 9716986.0
num_examples: 480
- name: val
num_bytes: 9686310.0
num_examples: 480
download_size: 48264072
dataset_size: 48631905.519999996
---
# Dataset Card for "CLCD_Cropped_256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mncai/orca_dpo_pairs_ko | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1.1.0 | ---
pretty_name: Evaluation run of JaeyeonKang/CCK_Asura_v1.1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JaeyeonKang/CCK_Asura_v1.1.0](https://huggingface.co/JaeyeonKang/CCK_Asura_v1.1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1.1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T05:03:46.732559](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1.1.0/blob/main/results_2024-02-18T05-03-46.732559.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7531028042999279,\n\
\ \"acc_stderr\": 0.02849607935807537,\n \"acc_norm\": 0.7561885555672273,\n\
\ \"acc_norm_stderr\": 0.02904469253900948,\n \"mc1\": 0.5361077111383109,\n\
\ \"mc1_stderr\": 0.017457800422268622,\n \"mc2\": 0.6955283891406179,\n\
\ \"mc2_stderr\": 0.01479273302144055\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6936860068259386,\n \"acc_stderr\": 0.013470584417276511,\n\
\ \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.01294203019513643\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7102170882294364,\n\
\ \"acc_stderr\": 0.0045273436511307965,\n \"acc_norm\": 0.8854809798844852,\n\
\ \"acc_norm_stderr\": 0.0031778979482849357\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n\
\ \"acc_stderr\": 0.03972552884785136,\n \"acc_norm\": 0.6962962962962963,\n\
\ \"acc_norm_stderr\": 0.03972552884785136\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.03064360707167709,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.03064360707167709\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n\
\ \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775402,\n\
\ \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775402\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.026280550932848087,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.026280550932848087\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.03320556443085569,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.03320556443085569\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.04971358884367406,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.04971358884367406\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n\
\ \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7361702127659574,\n \"acc_stderr\": 0.028809989854102956,\n\
\ \"acc_norm\": 0.7361702127659574,\n \"acc_norm_stderr\": 0.028809989854102956\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n\
\ \"acc_stderr\": 0.04579639422070435,\n \"acc_norm\": 0.6140350877192983,\n\
\ \"acc_norm_stderr\": 0.04579639422070435\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5396825396825397,\n \"acc_stderr\": 0.025670080636909308,\n \"\
acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.025670080636909308\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.864516129032258,\n\
\ \"acc_stderr\": 0.019469334586486933,\n \"acc_norm\": 0.864516129032258,\n\
\ \"acc_norm_stderr\": 0.019469334586486933\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969565,\n\
\ \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969565\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\"\
: 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"\
acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607558,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607558\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246815,\n\
\ \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246815\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4222222222222222,\n \"acc_stderr\": 0.030114442019668092,\n \
\ \"acc_norm\": 0.4222222222222222,\n \"acc_norm_stderr\": 0.030114442019668092\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.02244826447683258,\n \
\ \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.02244826447683258\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5231788079470199,\n \"acc_stderr\": 0.04078093859163085,\n \"\
acc_norm\": 0.5231788079470199,\n \"acc_norm_stderr\": 0.04078093859163085\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116245,\n \"\
acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116245\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7083333333333334,\n \"acc_stderr\": 0.030998666304560517,\n \"\
acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.030998666304560517\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.018094247116473332,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.018094247116473332\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9338842975206612,\n \"acc_stderr\": 0.022683403691723305,\n \"\
acc_norm\": 0.9338842975206612,\n \"acc_norm_stderr\": 0.022683403691723305\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.03343270062869621,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.03343270062869621\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6428571428571429,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.6428571428571429,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n\
\ \"acc_stderr\": 0.016534627684311364,\n \"acc_norm\": 0.9316239316239316,\n\
\ \"acc_norm_stderr\": 0.016534627684311364\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8952745849297573,\n\
\ \"acc_stderr\": 0.010949664098633358,\n \"acc_norm\": 0.8952745849297573,\n\
\ \"acc_norm_stderr\": 0.010949664098633358\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442272,\n\
\ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442272\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.659217877094972,\n\
\ \"acc_stderr\": 0.015852002449862096,\n \"acc_norm\": 0.659217877094972,\n\
\ \"acc_norm_stderr\": 0.015852002449862096\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.021668400256514307,\n\
\ \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.021668400256514307\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n\
\ \"acc_stderr\": 0.0216700588855108,\n \"acc_norm\": 0.8231511254019293,\n\
\ \"acc_norm_stderr\": 0.0216700588855108\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.019766459563597256,\n\
\ \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.019766459563597256\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5921985815602837,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.5921985815602837,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5958279009126467,\n\
\ \"acc_stderr\": 0.012533504046491367,\n \"acc_norm\": 0.5958279009126467,\n\
\ \"acc_norm_stderr\": 0.012533504046491367\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8161764705882353,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.8161764705882353,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8333333333333334,\n \"acc_stderr\": 0.015076937921915376,\n \
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.015076937921915376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.024352800722970015,\n\
\ \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.024352800722970015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9203980099502488,\n\
\ \"acc_stderr\": 0.01913968563350382,\n \"acc_norm\": 0.9203980099502488,\n\
\ \"acc_norm_stderr\": 0.01913968563350382\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759057,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759057\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5361077111383109,\n\
\ \"mc1_stderr\": 0.017457800422268622,\n \"mc2\": 0.6955283891406179,\n\
\ \"mc2_stderr\": 0.01479273302144055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250697\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6846095526914329,\n \
\ \"acc_stderr\": 0.01279935367580183\n }\n}\n```"
repo_url: https://huggingface.co/JaeyeonKang/CCK_Asura_v1.1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|arc:challenge|25_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|gsm8k|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hellaswag|10_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T05-03-46.732559.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T05-03-46.732559.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- '**/details_harness|winogrande|5_2024-02-18T05-03-46.732559.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T05-03-46.732559.parquet'
- config_name: results
data_files:
- split: 2024_02_18T05_03_46.732559
path:
- results_2024-02-18T05-03-46.732559.parquet
- split: latest
path:
- results_2024-02-18T05-03-46.732559.parquet
---
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v1.1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Asura_v1.1.0](https://huggingface.co/JaeyeonKang/CCK_Asura_v1.1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1.1.0",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-18T05:03:46.732559](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1.1.0/blob/main/results_2024-02-18T05-03-46.732559.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its own "latest" split, and the aggregated numbers in the "results" config):
```python
{
"all": {
"acc": 0.7531028042999279,
"acc_stderr": 0.02849607935807537,
"acc_norm": 0.7561885555672273,
"acc_norm_stderr": 0.02904469253900948,
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268622,
"mc2": 0.6955283891406179,
"mc2_stderr": 0.01479273302144055
},
"harness|arc:challenge|25": {
"acc": 0.6936860068259386,
"acc_stderr": 0.013470584417276511,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.01294203019513643
},
"harness|hellaswag|10": {
"acc": 0.7102170882294364,
"acc_stderr": 0.0045273436511307965,
"acc_norm": 0.8854809798844852,
"acc_norm_stderr": 0.0031778979482849357
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6962962962962963,
"acc_stderr": 0.03972552884785136,
"acc_norm": 0.6962962962962963,
"acc_norm_stderr": 0.03972552884785136
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.03064360707167709,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.03064360707167709
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775402,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775402
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.026280550932848087,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.026280550932848087
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.03320556443085569,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.03320556443085569
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7361702127659574,
"acc_stderr": 0.028809989854102956,
"acc_norm": 0.7361702127659574,
"acc_norm_stderr": 0.028809989854102956
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070435,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070435
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.025670080636909308,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.025670080636909308
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.864516129032258,
"acc_stderr": 0.019469334586486933,
"acc_norm": 0.864516129032258,
"acc_norm_stderr": 0.019469334586486933
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969565,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969565
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607558,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607558
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.020567539567246815,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.020567539567246815
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.030114442019668092,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.030114442019668092
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.02244826447683258,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.02244826447683258
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5231788079470199,
"acc_stderr": 0.04078093859163085,
"acc_norm": 0.5231788079470199,
"acc_norm_stderr": 0.04078093859163085
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116245,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.030998666304560517,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.030998666304560517
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.018094247116473332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.018094247116473332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951538,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951538
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9338842975206612,
"acc_stderr": 0.022683403691723305,
"acc_norm": 0.9338842975206612,
"acc_norm_stderr": 0.022683403691723305
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869621,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869621
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237103,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.0339329572976101,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.0339329572976101
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311364,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311364
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8952745849297573,
"acc_stderr": 0.010949664098633358,
"acc_norm": 0.8952745849297573,
"acc_norm_stderr": 0.010949664098633358
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442272,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442272
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.659217877094972,
"acc_stderr": 0.015852002449862096,
"acc_norm": 0.659217877094972,
"acc_norm_stderr": 0.015852002449862096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.021668400256514307,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.021668400256514307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.0216700588855108,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.0216700588855108
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.019766459563597256,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.019766459563597256
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5921985815602837,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.5921985815602837,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5958279009126467,
"acc_stderr": 0.012533504046491367,
"acc_norm": 0.5958279009126467,
"acc_norm_stderr": 0.012533504046491367
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8161764705882353,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.8161764705882353,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.015076937921915376,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.015076937921915376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.024352800722970015,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.024352800722970015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9203980099502488,
"acc_stderr": 0.01913968563350382,
"acc_norm": 0.9203980099502488,
"acc_norm_stderr": 0.01913968563350382
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759057,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759057
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268622,
"mc2": 0.6955283891406179,
"mc2_stderr": 0.01479273302144055
},
"harness|winogrande|5": {
"acc": 0.8531965272296764,
"acc_stderr": 0.009946627440250697
},
"harness|gsm8k|5": {
"acc": 0.6846095526914329,
"acc_stderr": 0.01279935367580183
}
}
```
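The aggregated metrics above are plain nested JSON, so extracting per-task scores is straightforward. The sketch below works on an abbreviated copy of the dictionary shown above (three entries only, values copied verbatim); it collects per-task accuracy while skipping the `"all"` aggregate:

```python
# Aggregated metrics in the shape shown above (abbreviated to three entries)
results = {
    "all": {"acc": 0.7531028042999279, "acc_norm": 0.7561885555672273},
    "harness|arc:challenge|25": {"acc": 0.6936860068259386, "acc_norm": 0.7320819112627986},
    "harness|hellaswag|10": {"acc": 0.7102170882294364, "acc_norm": 0.8854809798844852},
}

# Collect per-task accuracy, skipping the "all" aggregate entry
per_task_acc = {task: m["acc"] for task, m in results.items() if task != "all"}

# Best-scoring task among the entries listed here
best_task = max(per_task_acc, key=per_task_acc.get)
print(best_task)  # harness|hellaswag|10
```

The same pattern applies to the full results file, since every task entry follows the `harness|<task>|<n_shots>` naming scheme used throughout the JSON above.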
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
JamieWithofs/Deepfake-and-real-images-4 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Fake
'1': Real
splits:
- name: train
num_bytes: 2997526316.387
num_examples: 121159
- name: test
num_bytes: 998844443.2
num_examples: 35304
- name: validation
num_bytes: 664886328.544
num_examples: 53184
download_size: 3847076562
dataset_size: 4661257088.131
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
vietgpt/ultrachat | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_1024
num_bytes: 1395429967.5845509
num_examples: 270812
- name: train_2048
num_bytes: 2779271371.8960648
num_examples: 539375
- name: train_4096
num_bytes: 3360683349.1806855
num_examples: 652210
download_size: 3454050489
dataset_size: 7535384688.661301
configs:
- config_name: default
data_files:
- split: train_1024
path: data/train_1024-*
- split: train_2048
path: data/train_2048-*
- split: train_4096
path: data/train_4096-*
---
# Dataset Card for "ultrachat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/clean-dataset-preview-zero | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: model_input
dtype: string
- name: response
dtype: string
- name: edited_response
dtype: string
- name: user_id
dtype: string
- name: check_nsfw_words_criteria
dtype: float64
splits:
- name: train
num_bytes: 115524003.7411831
num_examples: 50510
download_size: 45480394
dataset_size: 115524003.7411831
---
# Dataset Card for "clean-dataset-preview-zero"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lazycuber/Open-hermes-2.5-alpaca | ---
language:
- en
license: apache-2.0
---
|
hou222/coco2023 | ---
license: bigscience-openrail-m
---
|
kyujinpy/KoCommercial-NoSSL | ---
language:
- ko
license: cc-by-nc-sa-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 187990458
num_examples: 175454
download_size: 110149618
dataset_size: 187990458
---
# Dataset for kyujinpy/KoCommercial-NoSSL
## Info
**Dataset size:** ~175K examples
**License:** CC-BY-NC-4.0 (*each dataset used in the merge is individually available for commercial use.)
**Dataset list (all usable for commercial purposes)**
1. [kyujinpy/KOpen-platypus](https://huggingface.co/datasets/kyujinpy/KOpen-platypus) (*Except non-commercial datasets)
2. [beomi/KoAlpaca-v1.1a](https://huggingface.co/datasets/beomi/KoAlpaca-v1.1a)
3. [HumanF-MarkrAI/WIKI_QA_Near_dedup](https://huggingface.co/datasets/HumanF-MarkrAI/WIKI_QA_Near_dedup)
4. [KorQuadv1.0](https://korquad.github.io/KorQuad%201.0/)
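Since every source above is distributed under a commercial-use-friendly license, the merge reduces to normalizing each one to the shared `instruction`/`input`/`output` schema declared in the YAML header and concatenating the rows. A minimal sketch of that shape (the sample rows are invented for illustration, not taken from the release):

```python
# Concatenate instruction datasets that share the same schema.
# Field names (instruction, input, output) match this card's YAML features.
source_a = [{"instruction": "Translate to Korean.", "input": "Hello", "output": "안녕하세요"}]
source_b = [{"instruction": "Summarize.", "input": "A long passage...", "output": "A short summary."}]

REQUIRED_KEYS = {"instruction", "input", "output"}

merged = []
for source in (source_a, source_b):
    for row in source:
        # Guard against schema drift before merging.
        assert REQUIRED_KEYS <= row.keys()
        merged.append(row)
```

In practice the same concatenation can be done with `datasets.concatenate_datasets` once each source is cast to the common features.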
# Another Dataset
- [kyujinpy/KoCommercial-SSL](https://huggingface.co/datasets/kyujinpy/KoCommercial-SSL).
- [MarkrAI/KoCommercial-Dataset](https://huggingface.co/datasets/MarkrAI/KoCommercial-Dataset).
|
Back-up/test-stsv-data | ---
dataset_info:
features:
- name: Answers
dtype: string
- name: Questions
dtype: string
splits:
- name: train
num_bytes: 104773.87782426778
num_examples: 496
download_size: 47625
dataset_size: 104773.87782426778
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test-stsv-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/e13426e5 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 166
num_examples: 10
download_size: 1307
dataset_size: 166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "e13426e5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pharaouk/biology_dataset_standardized_cluster_1 | ---
dataset_info:
features: []
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 0
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "biology_dataset_standardized_cluster_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lansinuote/nlp.7.translation | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 8389390
num_examples: 20000
- name: validation
num_bytes: 84758
num_examples: 200
- name: test
num_bytes: 84885
num_examples: 200
download_size: 0
dataset_size: 8559033
---
# Dataset Card for "nlp.7.translation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jpcorb20/medical_wikipedia | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
- name: wiki_id
dtype: int32
- name: paragraph_id
dtype: int32
- name: topic_infer
dtype: int64
- name: prob
dtype: float64
splits:
- name: train
num_bytes: 565706758
num_examples: 1139464
download_size: 0
dataset_size: 565706758
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-generation
language:
- en
tags:
- medical
pretty_name: w
size_categories:
- 1M<n<10M
---
# MedWiki from ClinicalCorp
This is a filtered version of `Cohere/wikipedia-22-12`, restricted to medical-topic articles using `MaartenGr/BERTopic_Wikipedia`. Note that some articles in the viewer might seem off-topic, but they are usually related in some way (e.g., World War I is linked to the Spanish Flu); these are artefacts of noise in the topic modelling.
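Each row keeps the inferred topic id (`topic_infer`) and its assignment probability (`prob`), so downstream users can tighten the filter themselves. A minimal sketch of such a post-filter (the topic ids, threshold, and sample rows are illustrative, not from the release):

```python
# Re-filter paragraphs to those confidently assigned to a chosen topic set.
# Rows mirror the dataset schema: title, text, wiki_id, paragraph_id, topic_infer, prob.
rows = [
    {"title": "Spanish flu", "text": "...", "wiki_id": 1, "paragraph_id": 0, "topic_infer": 12, "prob": 0.93},
    {"title": "World War I", "text": "...", "wiki_id": 2, "paragraph_id": 0, "topic_infer": 12, "prob": 0.41},
]

MEDICAL_TOPICS = {12}  # hypothetical topic ids; see med_topics.csv for the real list
MIN_PROB = 0.8         # illustrative confidence cutoff

def keep(row):
    return row["topic_infer"] in MEDICAL_TOPICS and row["prob"] >= MIN_PROB

filtered = [r for r in rows if keep(r)]
```

Raising `MIN_PROB` trades recall for precision and prunes weakly related articles like the second row above.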
## Original Dataset
https://huggingface.co/datasets/Cohere/wikipedia-22-12
## Topic modelling
https://huggingface.co/MaartenGr/BERTopic_Wikipedia
Check `med_topics.csv` in the git repo for more detail on which topics were targeted by prompting `GPT-3.5-turbo 0613` over the word representations of each topic. The original topic list can be obtained from the topic model.
# Citation
\[TBD\] |
christy/imdb_embeddings | ---
license: mit
---
|