| datasetId | card |
|---|---|
AdapterOcean/python3-standardized_cluster_5 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 75645196
num_examples: 7730
download_size: 0
dataset_size: 75645196
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuxiao1468/simulation_car_dataset | ---
license: afl-3.0
---
|
Atipico1/mrqa-test-mid | ---
dataset_info:
features:
- name: subset
dtype: string
- name: qid
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: masked_query
dtype: string
- name: context
dtype: string
- name: answer_sent
dtype: string
- name: answer_in_context
sequence: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: clear_answer_sent
dtype: string
- name: vague_answer_sent
dtype: string
- name: adversary
dtype: string
- name: replace_count
dtype: int64
- name: adversarial_passage
dtype: string
- name: masked_answer_sent
dtype: string
- name: num_mask_token
dtype: int64
- name: entities
sequence: string
- name: gpt_adv_sent
dtype: string
- name: is_same
dtype: string
splits:
- name: train
num_bytes: 2046661
num_examples: 977
download_size: 1342844
dataset_size: 2046661
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
InferencePrince555/Resume-Dataset | ---
license: apache-2.0
---
|
shredder-31/MCQ_Question_DataSets | ---
dataset_info:
features:
- name: Context
dtype: string
- name: Answer
dtype: string
- name: Question
dtype: string
- name: Choices
dtype: string
splits:
- name: train
num_bytes: 180865039
num_examples: 111366
download_size: 111809572
dataset_size: 180865039
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
--- |
open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500 | ---
pretty_name: Evaluation run of mncai/Llama2-7B-guanaco-dolphin-500
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/Llama2-7B-guanaco-dolphin-500](https://huggingface.co/mncai/Llama2-7B-guanaco-dolphin-500)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T13:57:10.861825](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500/blob/main/results_2024-02-19T13-57-10.861825.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48789166502400916,\n\
\ \"acc_stderr\": 0.03456685332517279,\n \"acc_norm\": 0.49373129590075493,\n\
\ \"acc_norm_stderr\": 0.03535583848667483,\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.016289203374403385,\n \"mc2\": 0.4693286672182885,\n\
\ \"mc2_stderr\": 0.01543993120710859\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5247440273037542,\n \"acc_stderr\": 0.014593487694937742,\n\
\ \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182526\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6215893248356901,\n\
\ \"acc_stderr\": 0.00483999574560232,\n \"acc_norm\": 0.8161720772754432,\n\
\ \"acc_norm_stderr\": 0.003865521762363158\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4716981132075472,\n \"acc_stderr\": 0.0307235352490061,\n\
\ \"acc_norm\": 0.4716981132075472,\n \"acc_norm_stderr\": 0.0307235352490061\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149352,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149352\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5387096774193548,\n\
\ \"acc_stderr\": 0.02835863485983694,\n \"acc_norm\": 0.5387096774193548,\n\
\ \"acc_norm_stderr\": 0.02835863485983694\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.0333276906841079,\n\
\ \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.0333276906841079\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.038154943086889305,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.038154943086889305\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n\
\ \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4512820512820513,\n \"acc_stderr\": 0.025230381238934837,\n\
\ \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.025230381238934837\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4579831932773109,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.4579831932773109,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6697247706422018,\n \"acc_stderr\": 0.020164466336342977,\n \"\
acc_norm\": 0.6697247706422018,\n \"acc_norm_stderr\": 0.020164466336342977\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828978,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828978\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5490196078431373,\n \"acc_stderr\": 0.03492406104163613,\n \"\
acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.03492406104163613\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.048467482539772386,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.048467482539772386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.7393162393162394,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6513409961685823,\n\
\ \"acc_stderr\": 0.01704124314349097,\n \"acc_norm\": 0.6513409961685823,\n\
\ \"acc_norm_stderr\": 0.01704124314349097\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.02681771813034892,\n\
\ \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.02681771813034892\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468636,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468636\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.0289473388516141,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.0289473388516141\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36766623207301175,\n\
\ \"acc_stderr\": 0.012314845910071695,\n \"acc_norm\": 0.36766623207301175,\n\
\ \"acc_norm_stderr\": 0.012314845910071695\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4722222222222222,\n \"acc_stderr\": 0.020196594933541197,\n \
\ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.020196594933541197\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4857142857142857,\n \"acc_stderr\": 0.03199615232806287,\n\
\ \"acc_norm\": 0.4857142857142857,\n \"acc_norm_stderr\": 0.03199615232806287\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.016289203374403385,\n \"mc2\": 0.4693286672182885,\n\
\ \"mc2_stderr\": 0.01543993120710859\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7411207576953434,\n \"acc_stderr\": 0.012310515810993372\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12736921910538287,\n \
\ \"acc_stderr\": 0.009183110326737829\n }\n}\n```"
repo_url: https://huggingface.co/mncai/Llama2-7B-guanaco-dolphin-500
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|arc:challenge|25_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T23_43_24.108245
path:
- '**/details_harness|drop|3_2023-10-25T23-43-24.108245.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T23-43-24.108245.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T23_43_24.108245
path:
- '**/details_harness|gsm8k|5_2023-10-25T23-43-24.108245.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|gsm8k|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hellaswag|10_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-36-50.573905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T13-57-10.861825.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-36-50.573905.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T13-57-10.861825.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T23_43_24.108245
path:
- '**/details_harness|winogrande|5_2023-10-25T23-43-24.108245.parquet'
- split: 2024_02_19T13_57_10.861825
path:
- '**/details_harness|winogrande|5_2024-02-19T13-57-10.861825.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T13-57-10.861825.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_36_50.573905
path:
- results_2023-10-03T19-36-50.573905.parquet
- split: 2023_10_25T23_43_24.108245
path:
- results_2023-10-25T23-43-24.108245.parquet
- split: 2024_02_19T13_57_10.861825
path:
- results_2024-02-19T13-57-10.861825.parquet
- split: latest
path:
- results_2024-02-19T13-57-10.861825.parquet
---
# Dataset Card for Evaluation run of mncai/Llama2-7B-guanaco-dolphin-500
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mncai/Llama2-7B-guanaco-dolphin-500](https://huggingface.co/mncai/Llama2-7B-guanaco-dolphin-500) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500",
"harness_winogrande_5",
        split="latest")
```
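The timestamped split names above follow a simple convention: the run timestamp, with `-` and `:` replaced by `_` so the name is a valid split identifier. A minimal sketch of that mapping (using the latest run's timestamp as an example):

```python
# Derive the split name for a given evaluation run timestamp.
# Convention used in this card: "-" and ":" in the ISO timestamp become "_".
run_timestamp = "2024-02-19T13:57:10.861825"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_02_19T13_57_10.861825
```

Passing such a name as `split=` to `load_dataset` selects that specific run instead of the latest one.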
## Latest results
These are the [latest results from run 2024-02-19T13:57:10.861825](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-dolphin-500/blob/main/results_2024-02-19T13-57-10.861825.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its configuration's "latest" split):
```json
{
"all": {
"acc": 0.48789166502400916,
"acc_stderr": 0.03456685332517279,
"acc_norm": 0.49373129590075493,
"acc_norm_stderr": 0.03535583848667483,
"mc1": 0.31701346389228885,
"mc1_stderr": 0.016289203374403385,
"mc2": 0.4693286672182885,
"mc2_stderr": 0.01543993120710859
},
"harness|arc:challenge|25": {
"acc": 0.5247440273037542,
"acc_stderr": 0.014593487694937742,
"acc_norm": 0.5674061433447098,
"acc_norm_stderr": 0.014478005694182526
},
"harness|hellaswag|10": {
"acc": 0.6215893248356901,
"acc_stderr": 0.00483999574560232,
"acc_norm": 0.8161720772754432,
"acc_norm_stderr": 0.003865521762363158
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4716981132075472,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.4716981132075472,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149352,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149352
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5387096774193548,
"acc_stderr": 0.02835863485983694,
"acc_norm": 0.5387096774193548,
"acc_norm_stderr": 0.02835863485983694
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.0333276906841079,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.0333276906841079
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6683937823834197,
"acc_stderr": 0.03397636541089118,
"acc_norm": 0.6683937823834197,
"acc_norm_stderr": 0.03397636541089118
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4512820512820513,
"acc_stderr": 0.025230381238934837,
"acc_norm": 0.4512820512820513,
"acc_norm_stderr": 0.025230381238934837
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4579831932773109,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.4579831932773109,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6697247706422018,
"acc_stderr": 0.020164466336342977,
"acc_norm": 0.6697247706422018,
"acc_norm_stderr": 0.020164466336342977
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828978,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828978
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.048467482539772386,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.048467482539772386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7393162393162394,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.7393162393162394,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6513409961685823,
"acc_stderr": 0.01704124314349097,
"acc_norm": 0.6513409961685823,
"acc_norm_stderr": 0.01704124314349097
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.02681771813034892,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.02681771813034892
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468636,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468636
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.0289473388516141,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.0289473388516141
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36766623207301175,
"acc_stderr": 0.012314845910071695,
"acc_norm": 0.36766623207301175,
"acc_norm_stderr": 0.012314845910071695
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.020196594933541197,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.020196594933541197
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4857142857142857,
"acc_stderr": 0.03199615232806287,
"acc_norm": 0.4857142857142857,
"acc_norm_stderr": 0.03199615232806287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979033,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31701346389228885,
"mc1_stderr": 0.016289203374403385,
"mc2": 0.4693286672182885,
"mc2_stderr": 0.01543993120710859
},
"harness|winogrande|5": {
"acc": 0.7411207576953434,
"acc_stderr": 0.012310515810993372
},
"harness|gsm8k|5": {
"acc": 0.12736921910538287,
"acc_stderr": 0.009183110326737829
}
}
```
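The results file above is plain JSON, so the aggregated metrics can be pulled out directly once it is downloaded. A small sketch using values from this run (the fragment below is abridged, not the full file):

```python
import json

# Abridged fragment of the results JSON shown above.
raw = """
{
  "all": {"acc": 0.48789166502400916, "acc_norm": 0.49373129590075493},
  "harness|gsm8k|5": {"acc": 0.12736921910538287}
}
"""

results = json.loads(raw)
print(round(results["all"]["acc"], 4))              # 0.4879
print(round(results["harness|gsm8k|5"]["acc"], 4))  # 0.1274
```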
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lloydmeta/drug_dataset_cleaned | ---
dataset_info:
features:
- name: patient_id
dtype: int64
- name: drugName
dtype: string
- name: condition
dtype: string
- name: review
dtype: string
- name: rating
dtype: float64
- name: date
dtype: string
- name: usefulCount
dtype: int64
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 65606437.22552815
num_examples: 110191
- name: validation
num_bytes: 16332471.085369816
num_examples: 27551
- name: test
num_bytes: 27300773.938145224
num_examples: 45890
download_size: 63535728
dataset_size: 109239682.24904318
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
zolak/twitter_dataset_1713018082 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: float64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 47798291
num_examples: 122970
download_size: 24157254
dataset_size: 47798291
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
elliotthwang/Evol-Instruct-Chinese-GPT4-0.5k | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 822751.2928571429
num_examples: 500
download_size: 524484
dataset_size: 822751.2928571429
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tamazight-NLP/FLORES-200-Tamasheq-Latin-Script | ---
license: cc-by-sa-4.0
task_categories:
- translation
- text2text-generation
language:
- en
- taq
- ber
annotations_creators:
- expert-generated
pretty_name: FLORES 200 (Tamasheq (Latin script))
size_categories:
- 1K<n<10K
--- |
Back-up/facebook_comment_dangerous_content | ---
dataset_info:
features:
- name: id
dtype: string
- name: _index
dtype: string
- name: _score
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 162791
num_examples: 1000
download_size: 90574
dataset_size: 162791
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vitruv/math_report_analysis | ---
license: mit
dataset_info:
features:
- name: main_dict
struct:
- name: 반 이름
dtype: string
- name: 반 인원 소요 시간
sequence: int64
- name: 반 인원 획득 점수
sequence: int64
- name: 시험 참가 인원
dtype: int64
- name: 학교 이름
dtype: string
- name: 학교 전체 시간 평균
dtype: int64
- name: 학교 전체 점수 평균
dtype: int64
- name: 학생들 이름
sequence: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 18188942
num_examples: 8500
- name: val
num_bytes: 1065528
num_examples: 500
download_size: 3148315
dataset_size: 19254470
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
|
Asis41/YuGiCard | ---
license: unknown
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 833972.0
num_examples: 10
download_size: 743126
dataset_size: 833972.0
---
|
umm-maybe/wikisource-cthulhu-mythos | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3274623
num_examples: 8332
- name: test
num_bytes: 125309
num_examples: 376
download_size: 1854810
dataset_size: 3399932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ArmelRandy/kde4 | ---
dataset_info:
features:
- name: en
dtype: string
- name: fr
dtype: string
splits:
- name: train
num_bytes: 10583004
num_examples: 20058
download_size: 5304712
dataset_size: 10583004
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "kde4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
souvenger/NLP2LinuxCommands | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 2518
num_examples: 48
- name: validation
num_bytes: 414
num_examples: 6
- name: test
num_bytes: 463
num_examples: 6
download_size: 6322
dataset_size: 3395
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
wtcherr/unsplash_10k_canny | ---
dataset_info:
features:
- name: image
dtype: image
- name: guide
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1714585428.0
num_examples: 10000
download_size: 1713856025
dataset_size: 1714585428.0
---
# Dataset Card for "unsplash_10k_canny"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amirulabu/majalah-com | ---
license: bsd
language:
- ms
- en
---
# Majalah.com scraped post data
## Notes
The separator is `@@@~~~@@@`. |
CyberHarem/irisviel_von_einzbern_fatekaleidlinerprismaillya | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Irisviel von Einzbern
This is the dataset of Irisviel von Einzbern, containing 93 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 93 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 220 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 93 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 93 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 93 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 93 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 93 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 220 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 220 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 220 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
sam-mosaic/orca-gpt4-chatml | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1868875699
num_examples: 994896
download_size: 1050255655
dataset_size: 1868875699
---
# Dataset Card for "orca-gpt4-chatml"
As of 7/21/23, the [OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca) dataset has something malformed, causing a crash when you try to load it with `datasets`.
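For reference, "ChatML format" wraps each turn in role-delimiter tokens; a `prompt`/`response` row might be rendered roughly like this (a sketch — the exact tokens used during preprocessing are an assumption):

```python
def to_chatml(prompt: str, response: str) -> str:
    """Render a prompt/response pair with ChatML-style role delimiters."""
    return (
        f"<|im_start|>user\n{prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n{response}<|im_end|>\n"
    )

print(to_chatml("What is the capital of France?", "Paris."))
```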
The GPT-4 data looks good though, so I preprocess it and push it up here in ChatML format. |
AnonymousSub/MedQuAD_Context_Question_Answer_Triples_TWO | ---
dataset_info:
features:
- name: Contexts
dtype: string
- name: Questions
dtype: string
- name: Answers
dtype: string
splits:
- name: train
num_bytes: 190839732
num_examples: 47441
download_size: 21760499
dataset_size: 190839732
---
# Dataset Card for "MedQuAD_Context_Question_Answer_Triples_TWO"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-libri-one-sec-cv12/chunk_4 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: logits
sequence: float32
splits:
- name: train
num_bytes: 273336099.7677486
num_examples: 8526
download_size: 216103721
dataset_size: 273336099.7677486
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_NeuralNovel__Mistral-7B-Instruct-v0.2-Neural-Story | ---
pretty_name: Evaluation run of NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story](https://huggingface.co/NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Mistral-7B-Instruct-v0.2-Neural-Story\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T18:54:03.241759](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Mistral-7B-Instruct-v0.2-Neural-Story/blob/main/results_2023-12-30T18-54-03.241759.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6060791614560437,\n\
\ \"acc_stderr\": 0.033211145729547226,\n \"acc_norm\": 0.6111814809313938,\n\
\ \"acc_norm_stderr\": 0.033882148359148205,\n \"mc1\": 0.5201958384332925,\n\
\ \"mc1_stderr\": 0.017489216849737053,\n \"mc2\": 0.6689337299841565,\n\
\ \"mc2_stderr\": 0.015285957609493764\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.591296928327645,\n \"acc_stderr\": 0.014365750345427,\n\
\ \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.014020224155839162\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6589324835690101,\n\
\ \"acc_stderr\": 0.0047309913571942945,\n \"acc_norm\": 0.8396733718382793,\n\
\ \"acc_norm_stderr\": 0.0036615885079775462\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.03999309712777474,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.03999309712777474\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956914,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956914\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562413,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562413\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n\
\ \"acc_stderr\": 0.026302774983517414,\n \"acc_norm\": 0.6903225806451613,\n\
\ \"acc_norm_stderr\": 0.026302774983517414\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723882,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723882\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.02506909438729653,\n \
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.02506909438729653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228395,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228395\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501602,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501602\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175371,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175371\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n\
\ \"acc_stderr\": 0.014774358319934486,\n \"acc_norm\": 0.7816091954022989,\n\
\ \"acc_norm_stderr\": 0.014774358319934486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n\
\ \"acc_stderr\": 0.016376966142610073,\n \"acc_norm\": 0.39888268156424583,\n\
\ \"acc_norm_stderr\": 0.016376966142610073\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.02656892101545715,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.02656892101545715\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119545,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765844,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765844\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085637,\n \
\ \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085637\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.02970528405677243,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.02970528405677243\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5201958384332925,\n\
\ \"mc1_stderr\": 0.017489216849737053,\n \"mc2\": 0.6689337299841565,\n\
\ \"mc2_stderr\": 0.015285957609493764\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011874\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38286580742987114,\n \
\ \"acc_stderr\": 0.013389223491820463\n }\n}\n```"
repo_url: https://huggingface.co/NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|arc:challenge|25_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|gsm8k|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hellaswag|10_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T18-54-03.241759.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T18-54-03.241759.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- '**/details_harness|winogrande|5_2023-12-30T18-54-03.241759.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T18-54-03.241759.parquet'
- config_name: results
data_files:
- split: 2023_12_30T18_54_03.241759
path:
- results_2023-12-30T18-54-03.241759.parquet
- split: latest
path:
- results_2023-12-30T18-54-03.241759.parquet
---
# Dataset Card for Evaluation run of NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story](https://huggingface.co/NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
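Judging by the split names in the YAML above, each per-run split name appears to be the run timestamp with the `-` and `:` separators replaced by underscores. A minimal sketch of that mapping (the helper name is hypothetical, inferred from the split names rather than documented behavior):

```python
def timestamp_to_split(ts: str) -> str:
    """Map an ISO-like run timestamp to the split name used in this card's YAML."""
    date, time = ts.split("T")
    # Underscores replace the date dashes and time colons; the "T" stays.
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2023-12-30T18:54:03.241759"))
# → 2023_12_30T18_54_03.241759
```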
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Mistral-7B-Instruct-v0.2-Neural-Story",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-30T18:54:03.241759](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Mistral-7B-Instruct-v0.2-Neural-Story/blob/main/results_2023-12-30T18-54-03.241759.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6060791614560437,
"acc_stderr": 0.033211145729547226,
"acc_norm": 0.6111814809313938,
"acc_norm_stderr": 0.033882148359148205,
"mc1": 0.5201958384332925,
"mc1_stderr": 0.017489216849737053,
"mc2": 0.6689337299841565,
"mc2_stderr": 0.015285957609493764
},
"harness|arc:challenge|25": {
"acc": 0.591296928327645,
"acc_stderr": 0.014365750345427,
"acc_norm": 0.6407849829351536,
"acc_norm_stderr": 0.014020224155839162
},
"harness|hellaswag|10": {
"acc": 0.6589324835690101,
"acc_stderr": 0.0047309913571942945,
"acc_norm": 0.8396733718382793,
"acc_norm_stderr": 0.0036615885079775462
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.03999309712777474,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.03999309712777474
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956914,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956914
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562413,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562413
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723882,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723882
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.02506909438729653,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.02506909438729653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228395,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228395
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501602,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501602
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175371,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175371
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7816091954022989,
"acc_stderr": 0.014774358319934486,
"acc_norm": 0.7816091954022989,
"acc_norm_stderr": 0.014774358319934486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.016376966142610073,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.016376966142610073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.02656892101545715,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.02656892101545715
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.02548311560119545,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.02548311560119545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765844,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765844
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085637,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.02970528405677243,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.02970528405677243
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5201958384332925,
"mc1_stderr": 0.017489216849737053,
"mc2": 0.6689337299841565,
"mc2_stderr": 0.015285957609493764
},
"harness|winogrande|5": {
"acc": 0.7584846093133386,
"acc_stderr": 0.012028983782011874
},
"harness|gsm8k|5": {
"acc": 0.38286580742987114,
"acc_stderr": 0.013389223491820463
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lumunot/adv-ele | ---
dataset_info:
features:
- name: ADV
dtype: string
- name: ELE
dtype: string
splits:
- name: train
num_bytes: 430918.56140350876
num_examples: 1732
- name: test
num_bytes: 107978.43859649122
num_examples: 434
download_size: 298309
dataset_size: 538897.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
killah-t-cell/boxes_train_controlnet_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 1365601206.528
num_examples: 26662
download_size: 1257141642
dataset_size: 1365601206.528
---
# Dataset Card for "boxes_train_controlnet_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tdh87/STORYDATASETV5 | ---
license: apache-2.0
---
|
ivanleomk/prompt_injection_password | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 142227
num_examples: 917
download_size: 53239
dataset_size: 142227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "prompt_injection_password"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jakelever/debug_drugprot2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: e1_indices
sequence: int64
- name: e2_indices
sequence: int64
- name: attention_mask
sequence: int8
- name: token_type_ids
sequence: int8
- name: label
sequence: float64
splits:
- name: train
num_bytes: 26601010
num_examples: 48689
- name: val
num_bytes: 6317892
num_examples: 12135
- name: test
num_bytes: 6595188
num_examples: 12621
download_size: 4223780
dataset_size: 39514090
---
# Dataset Card for "debug_drugprot2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DeepLearner101/ImageNetSubset_2KI10C-Training | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 96672633.0
num_examples: 2560
- name: validation
num_bytes: 18659288.0
num_examples: 500
download_size: 115299725
dataset_size: 115331921.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
NbAiLab/norwegian-xsum | ---
language:
- 'no'
--- |
NickyNicky/synthetic_text_to_sql_format_chatML_gemma | ---
dataset_info:
features:
- name: Text
dtype: string
- name: len_token_text
dtype: int64
splits:
- name: train
num_bytes: 142483016
num_examples: 100000
- name: test
num_bytes: 8325593
num_examples: 5851
download_size: 44871261
dataset_size: 150808609
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: apache-2.0
language:
- en
---
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/641b435ba5f876fe30c5ae0a/bXI58O_0PXDmFGl8EeK7F.png" alt="" style="width: 60%; max-height: 550px;">
</p>
## Base dataset
```
gretelai/synthetic_text_to_sql
```
```py
from datasets import load_dataset

dataset = load_dataset("NickyNicky/synthetic_text_to_sql_format_chatML_gemma")
```
```
<bos><start_of_turn>system
You are a helpful AI assistant.
you are a sql expert who responds in json format.<end_of_turn>
<start_of_turn>user
## prompt:
What is the total gold production by 'Site B' in the 'production' table?
## sql context:
CREATE TABLE production (id INT, site VARCHAR(50), year INT, gold_production INT, silver_production INT);<end_of_turn>
<start_of_turn>model
{
"sql_explanation": "The SQL query calculates the total gold production for 'Site B' by using the SUM function on the 'gold_production' column in the 'production' table, filtered by the 'site' column equal to 'Site B'.",
"sql_complexity_description": "basic SQL with a simple select statement",
"sql_complexity": "basic SQL",
"sql": "SELECT SUM(gold_production) FROM production WHERE site = 'Site B';",
"domain_description": "Mining operation data, environmental impact stats, workforce diversity, and resource depletion metrics.",
"sql_task_type_description": "generating reports, dashboards, and analytical insights",
"sql_task_type": "analytics and reporting",
"domain": "mining industry"
}<end_of_turn><eos>
```
```
<bos><start_of_turn>system
You are a helpful AI assistant.
you are a sql expert who responds in json format.<end_of_turn>
<start_of_turn>user
## prompt:
What is the total number of ground vehicles sold by GlobalTechDefense in the Middle East?
## sql context:
CREATE TABLE IF NOT EXISTS sales (equipment_type varchar(20), quantity_sold int, company varchar(30), region varchar(30));
INSERT INTO sales (equipment_type, quantity_sold, company, region) VALUES ('tank', 150, 'GlobalTechDefense', 'Middle East'), ('jeep', 75, 'GlobalTechDefense', 'Middle East'), ('truck', 120, 'GlobalTechDefense', 'Middle East');<end_of_turn>
<start_of_turn>model
{
"sql_explanation": "The SQL query calculates the total number of ground vehicles sold by GlobalTechDefense in the Middle East by summing the quantity_sold column. It filters for the relevant equipment_type, company, and region.",
"sql_complexity_description": "basic SQL with a simple select statement",
"sql_complexity": "basic SQL",
"sql": "SELECT SUM(quantity_sold) FROM sales WHERE equipment_type = 'ground vehicles' AND company = 'GlobalTechDefense' AND region = 'Middle East';",
"domain_description": "Military equipment sales data, defense project timelines, contract negotiations, and geopolitical risk assessments.",
"sql_task_type_description": "generating reports, dashboards, and analytical insights",
"sql_task_type": "analytics and reporting",
"domain": "defense contractors"
}<end_of_turn><eos>
```
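Because each model turn is a JSON object, the payload between the `<start_of_turn>model` marker and the closing `<end_of_turn>` can be parsed directly with the standard library. A minimal sketch (the helper name is ours; the markers follow the Gemma chat template shown above):

```python
import json

def extract_model_json(transcript: str) -> dict:
    """Return the JSON object emitted in the model turn of a Gemma-format transcript."""
    start = transcript.index("<start_of_turn>model") + len("<start_of_turn>model")
    end = transcript.index("<end_of_turn>", start)
    return json.loads(transcript[start:end])

sample = '<start_of_turn>model\n{"sql": "SELECT 1;", "domain": "demo"}<end_of_turn><eos>'
print(extract_model_json(sample)["sql"])  # → SELECT 1;
```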
<!-- https://colab.research.google.com/drive/11nF88nFHe-zHNxZf-aFpwASj6Txe979c#scrollTo=tF-dM-wFjDJN --> |
Ti-Ma/wikipedia_2014 | ---
license: cc-by-sa-3.0
---
|
Imrankhanjoya/ecom-con | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_codellama__CodeLlama-34b-Python-hf | ---
pretty_name: Evaluation run of codellama/CodeLlama-34b-Python-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-34b-Python-hf](https://huggingface.co/codellama/CodeLlama-34b-Python-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-34b-Python-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T02:11:34.305471](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-34b-Python-hf/blob/main/results_2024-02-19T02-11-34.305471.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4928401323448239,\n\
\ \"acc_stderr\": 0.03436384457050625,\n \"acc_norm\": 0.4959176673909788,\n\
\ \"acc_norm_stderr\": 0.03508471975074113,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.4137349714821021,\n\
\ \"mc2_stderr\": 0.014581913837481237\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4684300341296928,\n \"acc_stderr\": 0.01458223646086698,\n\
\ \"acc_norm\": 0.5042662116040956,\n \"acc_norm_stderr\": 0.014610858923956955\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5615415255925115,\n\
\ \"acc_stderr\": 0.004951840978219683,\n \"acc_norm\": 0.7635929097789285,\n\
\ \"acc_norm_stderr\": 0.004240066898702514\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500476,\n\
\ \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500476\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n\
\ \"acc_stderr\": 0.04177578950739993,\n \"acc_norm\": 0.4791666666666667,\n\
\ \"acc_norm_stderr\": 0.04177578950739993\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.031778212502369216,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.031778212502369216\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.02459497512892094,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02459497512892094\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.033442837442804574,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.033442837442804574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.038254602783800246,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.038254602783800246\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6060606060606061,\n \"acc_stderr\": 0.034812853382329624,\n \"\
acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.034812853382329624\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.032922966391551414,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.032922966391551414\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126177,\n\
\ \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126177\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6697247706422018,\n \"acc_stderr\": 0.020164466336342973,\n \"\
acc_norm\": 0.6697247706422018,\n \"acc_norm_stderr\": 0.020164466336342973\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560524,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560524\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"\
acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6371308016877637,\n \"acc_stderr\": 0.031299208255302136,\n \
\ \"acc_norm\": 0.6371308016877637,\n \"acc_norm_stderr\": 0.031299208255302136\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.48878923766816146,\n\
\ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.48878923766816146,\n\
\ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n\
\ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.7307692307692307,\n\
\ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6564495530012772,\n\
\ \"acc_stderr\": 0.016982145632652466,\n \"acc_norm\": 0.6564495530012772,\n\
\ \"acc_norm_stderr\": 0.016982145632652466\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.02690784985628254,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.02690784985628254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3463687150837989,\n\
\ \"acc_stderr\": 0.015913546784020117,\n \"acc_norm\": 0.3463687150837989,\n\
\ \"acc_norm_stderr\": 0.015913546784020117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n\
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5498392282958199,\n\
\ \"acc_stderr\": 0.02825666072336018,\n \"acc_norm\": 0.5498392282958199,\n\
\ \"acc_norm_stderr\": 0.02825666072336018\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583302,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583302\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \
\ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35071707953063885,\n\
\ \"acc_stderr\": 0.012187773370741525,\n \"acc_norm\": 0.35071707953063885,\n\
\ \"acc_norm_stderr\": 0.012187773370741525\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.03000856284500348,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.03000856284500348\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.43137254901960786,\n \"acc_stderr\": 0.02003639376835263,\n \
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.02003639376835263\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979035,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979035\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.4137349714821021,\n\
\ \"mc2_stderr\": 0.014581913837481237\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7190213101815311,\n \"acc_stderr\": 0.012632541095875825\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32297194844579225,\n \
\ \"acc_stderr\": 0.01288036079485182\n }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-34b-Python-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|arc:challenge|25_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|arc:challenge|25_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_25_35.748901
path:
- '**/details_harness|drop|3_2023-09-22T19-25-35.748901.parquet'
- split: 2023_10_16T00_56_20.013624
path:
- '**/details_harness|drop|3_2023-10-16T00-56-20.013624.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T00-56-20.013624.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_25_35.748901
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-25-35.748901.parquet'
- split: 2023_10_16T00_56_20.013624
path:
- '**/details_harness|gsm8k|5_2023-10-16T00-56-20.013624.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|gsm8k|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hellaswag|10_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hellaswag|10_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T20:08:27.081225.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T02-11-34.305471.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T20:08:27.081225.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T02-11-34.305471.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_25_35.748901
path:
- '**/details_harness|winogrande|5_2023-09-22T19-25-35.748901.parquet'
- split: 2023_10_16T00_56_20.013624
path:
- '**/details_harness|winogrande|5_2023-10-16T00-56-20.013624.parquet'
- split: 2024_02_19T02_11_34.305471
path:
- '**/details_harness|winogrande|5_2024-02-19T02-11-34.305471.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T02-11-34.305471.parquet'
- config_name: results
data_files:
- split: 2023_08_26T20_08_27.081225
path:
- results_2023-08-26T20:08:27.081225.parquet
- split: 2023_09_22T19_25_35.748901
path:
- results_2023-09-22T19-25-35.748901.parquet
- split: 2023_10_16T00_56_20.013624
path:
- results_2023-10-16T00-56-20.013624.parquet
- split: 2024_02_19T02_11_34.305471
path:
- results_2024-02-19T02-11-34.305471.parquet
- split: latest
path:
- results_2024-02-19T02-11-34.305471.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-34b-Python-hf
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-34b-Python-hf](https://huggingface.co/codellama/CodeLlama-34b-Python-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
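The split names listed in the configuration above are derived from the run timestamps. As a rough sketch of that convention (inferred from the split names in this card — the helper name is hypothetical, not part of any library):

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Convert a run timestamp (e.g. '2024-02-19T02:11:34.305471')
    into the split-name form used by this dataset
    (e.g. '2024_02_19T02_11_34.305471')."""
    date_part, time_part = timestamp.split("T")
    # YAML split names replace '-' in the date and ':' in the time with '_'
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")

print(timestamp_to_split_name("2024-02-19T02:11:34.305471"))
# 2024_02_19T02_11_34.305471
```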
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-34b-Python-hf",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-19T02:11:34.305471](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-34b-Python-hf/blob/main/results_2024-02-19T02-11-34.305471.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.4928401323448239,
"acc_stderr": 0.03436384457050625,
"acc_norm": 0.4959176673909788,
"acc_norm_stderr": 0.03508471975074113,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.4137349714821021,
"mc2_stderr": 0.014581913837481237
},
"harness|arc:challenge|25": {
"acc": 0.4684300341296928,
"acc_stderr": 0.01458223646086698,
"acc_norm": 0.5042662116040956,
"acc_norm_stderr": 0.014610858923956955
},
"harness|hellaswag|10": {
"acc": 0.5615415255925115,
"acc_stderr": 0.004951840978219683,
"acc_norm": 0.7635929097789285,
"acc_norm_stderr": 0.004240066898702514
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5132075471698113,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.5132075471698113,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.04177578950739993,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.04177578950739993
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.031778212502369216,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.031778212502369216
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02459497512892094,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02459497512892094
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.033442837442804574,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.033442837442804574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.038254602783800246,
"acc_norm": 0.6,
"acc_norm_stderr": 0.038254602783800246
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.034812853382329624,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.034812853382329624
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.032922966391551414,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.032922966391551414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45384615384615384,
"acc_stderr": 0.025242770987126177,
"acc_norm": 0.45384615384615384,
"acc_norm_stderr": 0.025242770987126177
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6697247706422018,
"acc_stderr": 0.020164466336342973,
"acc_norm": 0.6697247706422018,
"acc_norm_stderr": 0.020164466336342973
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560524,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.03320574612945431,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.03320574612945431
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6371308016877637,
"acc_stderr": 0.031299208255302136,
"acc_norm": 0.6371308016877637,
"acc_norm_stderr": 0.031299208255302136
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.48878923766816146,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.48878923766816146,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6564495530012772,
"acc_stderr": 0.016982145632652466,
"acc_norm": 0.6564495530012772,
"acc_norm_stderr": 0.016982145632652466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3463687150837989,
"acc_stderr": 0.015913546784020117,
"acc_norm": 0.3463687150837989,
"acc_norm_stderr": 0.015913546784020117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02861462475280544,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02861462475280544
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5498392282958199,
"acc_stderr": 0.02825666072336018,
"acc_norm": 0.5498392282958199,
"acc_norm_stderr": 0.02825666072336018
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583302,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583302
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963764,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963764
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35071707953063885,
"acc_stderr": 0.012187773370741525,
"acc_norm": 0.35071707953063885,
"acc_norm_stderr": 0.012187773370741525
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.03000856284500348,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.03000856284500348
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.02003639376835263,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.02003639376835263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.6,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979035,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979035
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.4137349714821021,
"mc2_stderr": 0.014581913837481237
},
"harness|winogrande|5": {
"acc": 0.7190213101815311,
"acc_stderr": 0.012632541095875825
},
"harness|gsm8k|5": {
"acc": 0.32297194844579225,
"acc_stderr": 0.01288036079485182
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AfhamAhmed1/damask-patterns | ---
language:
- en
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 13168081.0
num_examples: 67
download_size: 13165839
dataset_size: 13168081.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mboth/waermeVerteilen-100-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Druckhaltestation
'1': HeizkreisAllgemein
'2': Heizkurve
'3': Kaeltemengenzaehler
'4': Pumpe
'5': Raum
'6': Regler
'7': Ruecklauf
'8': Uebertrager
'9': Ventil
'10': Vorlauf
'11': Waermemengenzaehler
'12': Warmwasserbereitung
splits:
- name: train
num_bytes: 216197.29691451695
num_examples: 1016
- name: test
num_bytes: 423002
num_examples: 1978
- name: valid
num_bytes: 423002
num_examples: 1978
download_size: 353233
dataset_size: 1062201.296914517
---
# Dataset Card for "waermeVerteilen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
0x70DA/sci_summ | ---
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: validation
num_bytes: 23361937.97759879
num_examples: 4631
- name: test
num_bytes: 23487172.952651516
num_examples: 4665
- name: train
num_bytes: 176474272.610434
num_examples: 34083
download_size: 120216439
dataset_size: 223323383.54068428
---
# Dataset Card for "sci_summ"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-college_medicine-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6685
num_examples: 5
- name: test
num_bytes: 601115
num_examples: 173
download_size: 16063
dataset_size: 607800
---
# Dataset Card for "mmlu-college_medicine-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CreativeLang/vua20_metaphor | ---
license: cc-by-2.0
---
# VUA20
## Dataset Description
- **Paper:** [A Report on the 2020 VUA and TOEFL Metaphor Detection Shared Task](https://aclanthology.org/2020.figlang-1.3/)
### Dataset Summary
Creative Language Toolkit (CLTK) Metadata
- CL Type: Metaphor
- Task Type: detection
- Size: 200k
- Created time: 2020
VUA20 is (**perhaps**) the largest metaphor detection dataset; it was used in the FigLang 2020 shared task.
For the details of this dataset, we refer you to the release [paper](https://aclanthology.org/2020.figlang-1.3/).
The annotation method of VUA20 is elaborated in the paper on [MIP](https://www.tandfonline.com/doi/abs/10.1080/10926480709336752).
### Citation Information
If you find this dataset helpful, please cite:
```
@inproceedings{Leong2020ARO,
title={A Report on the 2020 VUA and TOEFL Metaphor Detection Shared Task},
author={Chee Wee Leong and Beata Beigman Klebanov and Chris Hamill and Egon W. Stemle and Rutuja Ubale and Xianyang Chen},
booktitle={FIGLANG},
year={2020}
}
```
### Contributions
If you have any queries, please open an issue or direct your queries to [mail](mailto:yucheng.li@surrey.ac.uk). |
japanese-asr/whisper_transcriptions.reazonspeech.large.wer_10.0 | ---
dataset_info:
config_name: large
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: whisper_transcript
sequence: int64
- name: input_length
dtype: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 145659866932.2361
num_examples: 1042883
download_size: 143373138290
dataset_size: 145659866932.2361
configs:
- config_name: large
data_files:
- split: train
path: large/train-*
---
|
chenqile09/llama2-chinese-couplet-100k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 33921909.405820444
num_examples: 100000
- name: validation
num_bytes: 1358512
num_examples: 4000
download_size: 13630532
dataset_size: 35280421.405820444
---
# Dataset Card for "llama2-chinese-couplet-100k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
graphs-datasets/CSL | ---
license: mit
task_categories:
- graph-ml
---
# Dataset Card for CSL
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [External Use](#external-use)
- [PyGeometric](#pygeometric)
- [Dataset Structure](#dataset-structure)
- [Data Properties](#data-properties)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **[Homepage](https://github.com/graphdeeplearning/benchmarking-gnns)**
- **Paper:** (see citation)
### Dataset Summary
The CSL dataset is a synthetic dataset designed to test GNN expressivity.
### Supported Tasks and Leaderboards
`CSL` should be used for binary graph classification, predicting whether or not graphs are isomorphic.
## External Use
### PyGeometric
To load in PyGeometric, do the following:
```python
from datasets import load_dataset
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader

dataset_hf = load_dataset("graphs-datasets/CSL")
# For the train set (replace by valid or test as needed).
# Each row is a dict of the fields listed below; unpack it into a PyG Data
# object (values may still need conversion to torch tensors).
dataset_pg_list = [Data(**graph) for graph in dataset_hf["train"]]
dataset_pg = DataLoader(dataset_pg_list)
```
## Dataset Structure
### Data Properties
| property | value |
|---|---|
| #graphs | 150 |
| average #nodes | 41.0 |
| average #edges | 164.0 |
### Data Fields
Each row of a given file is a graph, with:
- `node_feat` (list: #nodes x #node-features): nodes
- `edge_index` (list: 2 x #edges): pairs of nodes constituting edges
- `edge_attr` (list: #edges x #edge-features): for the aforementioned edges, contains their features
- `y` (list: #labels): contains the number of labels available to predict
- `num_nodes` (int): number of nodes of the graph
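To make the field layout concrete, here is a minimal sketch of one row as a plain Python dict, with made-up values (real CSL graphs have 41 nodes on average); the consistency checks follow directly from the field descriptions above:

```python
# Illustrative row with the fields described above; values are invented
# for demonstration and are NOT taken from the actual CSL dataset.
row = {
    "node_feat": [[1.0], [1.0], [1.0]],    # 3 nodes x 1 node-feature
    "edge_index": [[0, 1, 2], [1, 2, 0]],  # 2 x 3: pairs of nodes forming edges
    "edge_attr": [[1.0], [1.0], [1.0]],    # 3 edges x 1 edge-feature
    "y": [0],                              # graph-level label(s)
    "num_nodes": 3,
}

# Consistency implied by the field descriptions
assert len(row["node_feat"]) == row["num_nodes"]
assert len(row["edge_index"]) == 2          # source row and target row
assert len(row["edge_attr"]) == len(row["edge_index"][0])

# Edges as (source, target) pairs
edges = list(zip(row["edge_index"][0], row["edge_index"][1]))
print(edges)  # [(0, 1), (1, 2), (2, 0)]
```

A dict with this shape is exactly what gets unpacked into a PyGeometric `Data` object in the loading snippet above.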
### Data Splits
This data is pre-split; the splits come from the PyGeometric version of the dataset.
## Additional Information
### Licensing Information
The dataset has been released under MIT license.
### Citation Information
```
@article{DBLP:journals/corr/abs-2003-00982,
author = {Vijay Prakash Dwivedi and
Chaitanya K. Joshi and
Thomas Laurent and
Yoshua Bengio and
Xavier Bresson},
title = {Benchmarking Graph Neural Networks},
journal = {CoRR},
volume = {abs/2003.00982},
year = {2020},
url = {https://arxiv.org/abs/2003.00982},
eprinttype = {arXiv},
eprint = {2003.00982},
timestamp = {Sat, 23 Jan 2021 01:14:30 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2003-00982.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` |
nicholasfalconi/imagenet | ---
license: apache-2.0
---
|
thanhdath/vietnamese-retrieval-v2 | ---
dataset_info:
features:
- name: query_id
dtype: string
- name: query
dtype: string
- name: positive_passages
list:
- name: docid
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: negative_passages
list:
- name: docid
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 6347713728
num_examples: 574167
download_size: 2987843540
dataset_size: 6347713728
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vietnamese-retrieval-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hlillemark/flores200_8_baseline | ---
dataset_info:
features:
- name: id
dtype: int32
- name: source_lang
dtype: string
- name: target_lang
dtype: string
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 7505410300
num_examples: 20480000
download_size: 5077247260
dataset_size: 7505410300
---
# Dataset Card for "flores200_8_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sayalishankar/replytextsdatasets | ---
license: mit
---
|
jordane95/procqa | ---
license: apache-2.0
---
|
freshpearYoon/v3_train_free_6 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 15366996016
num_examples: 10000
download_size: 2378595355
dataset_size: 15366996016
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
St4n/my_datasets | ---
license: unknown
language:
- en
--- |
open-llm-leaderboard/details_nlpguy__Lelantos-low-tune | ---
pretty_name: Evaluation run of nlpguy/Lelantos-low-tune
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nlpguy/Lelantos-low-tune](https://huggingface.co/nlpguy/Lelantos-low-tune) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__Lelantos-low-tune\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T20:37:11.486361](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Lelantos-low-tune/blob/main/results_2024-01-06T20-37-11.486361.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6447697519576503,\n\
\ \"acc_stderr\": 0.032166201051049795,\n \"acc_norm\": 0.6453578989917766,\n\
\ \"acc_norm_stderr\": 0.03281998402567921,\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6133400726022729,\n\
\ \"mc2_stderr\": 0.015422148614448189\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.013975454122756562,\n\
\ \"acc_norm\": 0.6706484641638225,\n \"acc_norm_stderr\": 0.013734057652635474\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6766580362477594,\n\
\ \"acc_stderr\": 0.004667960519938637,\n \"acc_norm\": 0.8605855407289384,\n\
\ \"acc_norm_stderr\": 0.0034567060380547555\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\"\
: 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\"\
: 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \
\ \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.02439667298509476,\n \
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n\
\ \"acc_stderr\": 0.016269088663959402,\n \"acc_norm\": 0.3843575418994413,\n\
\ \"acc_norm_stderr\": 0.016269088663959402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596729,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596729\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352818,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352818\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6133400726022729,\n\
\ \"mc2_stderr\": 0.015422148614448189\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597223\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6679302501895376,\n \
\ \"acc_stderr\": 0.012972465034361867\n }\n}\n```"
repo_url: https://huggingface.co/nlpguy/Lelantos-low-tune
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|arc:challenge|25_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|gsm8k|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hellaswag|10_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T20-37-11.486361.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T20-37-11.486361.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- '**/details_harness|winogrande|5_2024-01-06T20-37-11.486361.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T20-37-11.486361.parquet'
- config_name: results
data_files:
- split: 2024_01_06T20_37_11.486361
path:
- results_2024-01-06T20-37-11.486361.parquet
- split: latest
path:
- results_2024-01-06T20-37-11.486361.parquet
---
# Dataset Card for Evaluation run of nlpguy/Lelantos-low-tune
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/Lelantos-low-tune](https://huggingface.co/nlpguy/Lelantos-low-tune) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__Lelantos-low-tune",
	"harness_winogrande_5",
	split="latest")
```
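Each timestamped split name is the run timestamp with the characters that are not allowed in split names replaced by underscores (for example, run `2024-01-06T20:37:11.486361` becomes split `2024_01_06T20_37_11.486361`). A minimal sketch of this mapping, using a hypothetical helper name:

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as "2024-01-06T20:37:11.486361"
    into the corresponding split name "2024_01_06T20_37_11.486361"
    by replacing "-" and ":" with "_" (the "." is kept as-is)."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-01-06T20:37:11.486361"))
# → 2024_01_06T20_37_11.486361
```

This can be handy when selecting a specific historical run instead of the `latest` split.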
## Latest results
These are the [latest results from run 2024-01-06T20:37:11.486361](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Lelantos-low-tune/blob/main/results_2024-01-06T20-37-11.486361.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6447697519576503,
"acc_stderr": 0.032166201051049795,
"acc_norm": 0.6453578989917766,
"acc_norm_stderr": 0.03281998402567921,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6133400726022729,
"mc2_stderr": 0.015422148614448189
},
"harness|arc:challenge|25": {
"acc": 0.6459044368600683,
"acc_stderr": 0.013975454122756562,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635474
},
"harness|hellaswag|10": {
"acc": 0.6766580362477594,
"acc_stderr": 0.004667960519938637,
"acc_norm": 0.8605855407289384,
"acc_norm_stderr": 0.0034567060380547555
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138208,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.016269088663959402,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.016269088663959402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596729,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596729
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352818,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352818
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6133400726022729,
"mc2_stderr": 0.015422148614448189
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597223
},
"harness|gsm8k|5": {
"acc": 0.6679302501895376,
"acc_stderr": 0.012972465034361867
}
}
```
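
The aggregate scores in the `"all"` block above are macro-averages over the individual tasks. As a minimal illustration of that averaging (using only a hand-picked subset of the per-task accuracies shown above, not the full task list, so the result intentionally differs from the reported `"all"` value):

```python
# Macro-average a few of the per-task accuracies reported above.
# NOTE: this is an illustrative subset only; the leaderboard averages
# over all reported tasks, so this value differs from the "all" block.
task_acc = {
    "harness|arc:challenge|25": 0.6459044368600683,
    "harness|hellaswag|10": 0.6766580362477594,
    "harness|winogrande|5": 0.7955801104972375,
    "harness|gsm8k|5": 0.6679302501895376,
}

macro_avg = sum(task_acc.values()) / len(task_acc)
print(f"macro-average acc over {len(task_acc)} tasks: {macro_avg:.4f}")
```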
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_maywell__PiVoT-MoE | ---
pretty_name: Evaluation run of maywell/PiVoT-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maywell/PiVoT-MoE](https://huggingface.co/maywell/PiVoT-MoE) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__PiVoT-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-24T01:47:47.057722](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-MoE/blob/main/results_2023-12-24T01-47-47.057722.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6069679478753063,\n\
\ \"acc_stderr\": 0.03311851757787681,\n \"acc_norm\": 0.6115463544343916,\n\
\ \"acc_norm_stderr\": 0.03378897575698116,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.5463839311843238,\n\
\ \"mc2_stderr\": 0.016228712279771185\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735567,\n\
\ \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175458\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6621190997809201,\n\
\ \"acc_stderr\": 0.004720210816162055,\n \"acc_norm\": 0.8351921927902808,\n\
\ \"acc_norm_stderr\": 0.003702487662126949\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.03899073687357334,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.03899073687357334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153327,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153327\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630804,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n\
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597563,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597563\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n\
\ \"acc_stderr\": 0.015721531075183877,\n \"acc_norm\": 0.329608938547486,\n\
\ \"acc_norm_stderr\": 0.015721531075183877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.0264930332251459,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.0264930332251459\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.012687818419599924,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.012687818419599924\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.01911721391149515,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.01911721391149515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.5463839311843238,\n\
\ \"mc2_stderr\": 0.016228712279771185\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207402\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3912054586808188,\n \
\ \"acc_stderr\": 0.013442502402794302\n }\n}\n```"
repo_url: https://huggingface.co/maywell/PiVoT-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|arc:challenge|25_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|gsm8k|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hellaswag|10_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T01-47-47.057722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-24T01-47-47.057722.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- '**/details_harness|winogrande|5_2023-12-24T01-47-47.057722.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-24T01-47-47.057722.parquet'
- config_name: results
data_files:
- split: 2023_12_24T01_47_47.057722
path:
- results_2023-12-24T01-47-47.057722.parquet
- split: latest
path:
- results_2023-12-24T01-47-47.057722.parquet
---
# Dataset Card for Evaluation run of maywell/PiVoT-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/PiVoT-MoE](https://huggingface.co/maywell/PiVoT-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__PiVoT-MoE",
"harness_winogrande_5",
	split="latest")
```
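Once a results file is loaded, the per-task metrics can be aggregated locally. A minimal sketch, assuming the JSON structure shown under "Latest results" (an `"all"` summary key plus one `"harness|<task>|<n_shot>"` key per task, each with an `"acc"` field); the `mean_task_acc` helper is hypothetical, not part of the leaderboard tooling:

```python
# Average the "acc" metric across per-task entries of a results dict.
# Assumes the dict shape used in the "Latest results" JSON of this card:
# an "all" summary key plus per-task keys like "harness|arc:challenge|25".

def mean_task_acc(results: dict) -> float:
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    ]
    return sum(accs) / len(accs)

# Tiny illustrative dict (hypothetical values, not from this run):
example = {
    "all": {"acc": 0.55},
    "harness|arc:challenge|25": {"acc": 0.6},
    "harness|hellaswag|10": {"acc": 0.5},
}
print(mean_task_acc(example))  # averages only the per-task entries
```

Note that the `"all"` entry is skipped so the precomputed summary does not bias the average.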
## Latest results
These are the [latest results from run 2023-12-24T01:47:47.057722](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-MoE/blob/main/results_2023-12-24T01-47-47.057722.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval's config):
```python
{
"all": {
"acc": 0.6069679478753063,
"acc_stderr": 0.03311851757787681,
"acc_norm": 0.6115463544343916,
"acc_norm_stderr": 0.03378897575698116,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.5463839311843238,
"mc2_stderr": 0.016228712279771185
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.014306946052735567,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175458
},
"harness|hellaswag|10": {
"acc": 0.6621190997809201,
"acc_stderr": 0.004720210816162055,
"acc_norm": 0.8351921927902808,
"acc_norm_stderr": 0.003702487662126949
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03899073687357334,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03899073687357334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153327,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153327
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857413,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857413
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630804,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114969,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114969
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597563,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597563
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.015721531075183877,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.015721531075183877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599924,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.01911721391149515,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.01911721391149515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979033,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.5463839311843238,
"mc2_stderr": 0.016228712279771185
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.011947592365207402
},
"harness|gsm8k|5": {
"acc": 0.3912054586808188,
"acc_stderr": 0.013442502402794302
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
idiotDeveloper/KoreanTelephone_Mini_dataset_processed | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 46829281128
num_examples: 48752
- name: test
num_bytes: 5853655240
num_examples: 6094
- name: valid
num_bytes: 5853670352
num_examples: 6094
download_size: 9112286935
dataset_size: 58536606720
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
mawadalla/scientific-figures-captions-xl | ---
task_categories:
- visual-question-answering
language:
- en
pretty_name: Scientific Figures and Captions
size_categories:
- 1M<n<10M
---
# Scientific Figures and Captions Dataset from Research Papers
This repository contains the Scientific Figures and Captions dataset, which includes approximately 4.2 million entries of scientific figures and their corresponding captions extracted from academic papers on arXiv. This dataset is intended for research purposes in the fields of computer vision and natural language processing, particularly for tasks related to image captioning and automated figure analysis.
## Dataset Description
The dataset is structured as a Parquet dataframe with two columns:
- `image_filename`: This column contains the relative paths to image files.
- `caption`: This column contains the textual captions associated with each image.
Images are stored under `dataset/figures/` and are compressed into multiple parts (.z01, .z02, ..., .z103) with a final `.zip` file that encompasses all parts. This format is used for efficiently handling large datasets.
## Extraction Instructions
To access the images, you must first decompress the multi-part ZIP archive. Make sure you have all parts of the archive (.z01 to .z103 and the .zip file) in the same directory. Most decompression tools will recognize and handle multi-part ZIP files seamlessly.
Here is an example using the command line with `unzip`:
```bash
# Navigate to the directory containing the compressed parts
cd dataset/figures
# Use unzip to extract the first set of images
unzip compressedfigures.zip
# Combine the parts of the second set of images into a single archive
cat compressedfigures_part2* > compressedfigures_part2.tar.gz
# Extract the second set of images
tar xf compressedfigures_part2.tar.gz
# You're good to go!
```
This will extract the contents into the `dataset/figures/` directory. Ensure that you have enough storage space to accommodate the uncompressed images.
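After extraction, it can be worth checking that the relative paths recorded in the Parquet index actually resolve to files on disk before starting a training run. A minimal sketch (the example file names below are hypothetical; in practice pass `df["image_filename"].tolist()`):

```python
from pathlib import Path

def missing_images(paths, root="."):
    """Return the subset of relative paths that do not exist under root."""
    root = Path(root)
    return [p for p in paths if not (root / p).is_file()]

# Hypothetical example list -- in practice use df["image_filename"].tolist()
example = ["dataset/figures/fig1.png", "dataset/figures/fig2.png"]
print(missing_images(example))
```

An empty result means every indexed image was extracted successfully.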
## Usage Example
To use the dataset in your Python projects, you'll need to read the Parquet file into a DataFrame. Here is an example using `pandas`:
```python
import pandas as pd
# Load the dataset into a DataFrame
df = pd.read_parquet('dataset.parquet')
# Display the first few entries
df.head()
```
Once the dataset is loaded, you can use it as follows:
```python
from PIL import Image
import matplotlib.pyplot as plt
# Example function to display an image with its caption
def show_image_with_caption(image_path, caption):
img = Image.open(image_path)
plt.imshow(img)
plt.title(caption)
plt.axis('off') # Hide the axis
plt.show()
# Display the first image and its caption
first_image_path = df.loc[0, 'image_filename']
first_caption = df.loc[0, 'caption']
show_image_with_caption(first_image_path, first_caption)
```
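Once loaded, the `caption` column can be filtered like any pandas Series, for example to select only figures whose captions mention a keyword. A small sketch (a toy DataFrame stands in here for the real `dataset.parquet`, with the same two columns):

```python
import pandas as pd

# Toy stand-in for the real dataset -- same two columns as the Parquet file
df = pd.DataFrame({
    "image_filename": ["figures/a.png", "figures/b.png", "figures/c.png"],
    "caption": [
        "Accuracy vs. number of training epochs.",
        "Schematic of the proposed architecture.",
        "Training loss curves for three baselines.",
    ],
})

# Case-insensitive substring match on the caption text
mask = df["caption"].str.contains("training", case=False)
subset = df[mask]

print(len(subset))                       # count of matching figures
print(subset["image_filename"].tolist())  # their image paths
```

The same pattern scales to the full 4.2M-row index, since the filter operates on the caption strings without touching the image files.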
## Acknowledgment
Special thanks to arXiv for providing access to all of the research papers.
thanhnamitit/alpaca_translate_GPT_35_10_20k | ---
license: unknown
---
|
open-llm-leaderboard/details_Open-Orca__Mistral-7B-SlimOrca | ---
pretty_name: Evaluation run of Open-Orca/Mistral-7B-SlimOrca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Open-Orca/Mistral-7B-SlimOrca](https://huggingface.co/Open-Orca/Mistral-7B-SlimOrca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Open-Orca__Mistral-7B-SlimOrca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T04:55:17.464867](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__Mistral-7B-SlimOrca/blob/main/results_2023-10-24T04-55-17.464867.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03460570469798658,\n\
\ \"em_stderr\": 0.0018718276753995743,\n \"f1\": 0.11197776845637529,\n\
\ \"f1_stderr\": 0.002382569794079873,\n \"acc\": 0.4940341305179057,\n\
\ \"acc_stderr\": 0.011521340479768794\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03460570469798658,\n \"em_stderr\": 0.0018718276753995743,\n\
\ \"f1\": 0.11197776845637529,\n \"f1_stderr\": 0.002382569794079873\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2137983320697498,\n \
\ \"acc_stderr\": 0.011293054698635044\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902543\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Open-Orca/Mistral-7B-SlimOrca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|arc:challenge|25_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T04_55_17.464867
path:
- '**/details_harness|drop|3_2023-10-24T04-55-17.464867.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T04-55-17.464867.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T04_55_17.464867
path:
- '**/details_harness|gsm8k|5_2023-10-24T04-55-17.464867.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T04-55-17.464867.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hellaswag|10_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T04_55_17.464867
path:
- '**/details_harness|winogrande|5_2023-10-24T04-55-17.464867.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T04-55-17.464867.parquet'
- config_name: results
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- results_2023-10-11T03-20-03.477959.parquet
- split: 2023_10_24T04_55_17.464867
path:
- results_2023-10-24T04-55-17.464867.parquet
- split: latest
path:
- results_2023-10-24T04-55-17.464867.parquet
---
# Dataset Card for Evaluation run of Open-Orca/Mistral-7B-SlimOrca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Open-Orca/Mistral-7B-SlimOrca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Open-Orca/Mistral-7B-SlimOrca](https://huggingface.co/Open-Orca/Mistral-7B-SlimOrca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Open-Orca__Mistral-7B-SlimOrca",
	"harness_winogrande_5",
	split="latest")
```
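The timestamped split names use a zero-padded `YYYY_MM_DDTHH_MM_SS` layout, so plain string comparison orders them chronologically — which is how the `latest` split can be resolved from the available run splits. A minimal sketch using the two run timestamps from this repo:

```python
# Split names sort chronologically as plain strings because every field
# is zero-padded, so the most recent run is simply the lexicographic max.
splits = [
    "2023_10_11T03_20_03.477959",
    "2023_10_24T04_55_17.464867",
]
latest = max(splits)
print(latest)  # 2023_10_24T04_55_17.464867
```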
## Latest results
These are the [latest results from run 2023-10-24T04:55:17.464867](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__Mistral-7B-SlimOrca/blob/main/results_2023-10-24T04-55-17.464867.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.03460570469798658,
"em_stderr": 0.0018718276753995743,
"f1": 0.11197776845637529,
"f1_stderr": 0.002382569794079873,
"acc": 0.4940341305179057,
"acc_stderr": 0.011521340479768794
},
"harness|drop|3": {
"em": 0.03460570469798658,
"em_stderr": 0.0018718276753995743,
"f1": 0.11197776845637529,
"f1_stderr": 0.002382569794079873
},
"harness|gsm8k|5": {
"acc": 0.2137983320697498,
"acc_stderr": 0.011293054698635044
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902543
}
}
```
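As a quick sanity check on the aggregated block above, the `all.acc` value is just the mean of the per-task accuracies; a minimal sketch reproducing it from the two `acc`-bearing entries:

```python
# Per-task metrics copied from the results JSON above.
results = {
    "harness|gsm8k|5": {"acc": 0.2137983320697498, "acc_stderr": 0.011293054698635044},
    "harness|winogrande|5": {"acc": 0.7742699289660616, "acc_stderr": 0.011749626260902543},
}
# Collect each task's accuracy and average them.
accs = {task: m["acc"] for task, m in results.items()}
mean_acc = sum(accs.values()) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")  # 0.4940, matching the "all" block
```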
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Back-up/test_qa | ---
dataset_info:
features:
- name: Questions
dtype: string
- name: Answers
dtype: string
splits:
- name: train
num_bytes: 13640
num_examples: 18
download_size: 14990
dataset_size: 13640
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/1000_People_German_Handwriting_OCR_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
1,000 People - German Handwriting OCR Data. The writers are Europeans who often write German. The device is a scanner, and the collection angle is eye-level. The dataset content includes addresses, company names, and personal names. The dataset can be used for tasks such as German handwriting OCR.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1404?source=Huggingface
## Data size
1,000 people; 14 images collected from each subject
## Population distribution
Gender distribution: 440 males, 560 females. Age distribution: 27 people under 18 years old, 887 people aged 18 to 45, 60 people aged 46 to 60, and 26 people over 60.
## Writer
Europeans who often write German
## Collecting environment
pure color background
## Device
scanner
## Photographic angle
eye-level angle
## Data format
the image data format is .png
## Data content
including addresses, company names, and personal names; each image has 20 writing boxes
## Accuracy rate
the collection content accuracy is not less than 97%
# Licensing Information
Commercial License
|
polinaeterna/tmp | ---
dataset_info:
features:
- name: a
dtype: int64
splits:
- name: train
num_bytes: 16
num_examples: 2
download_size: 829
dataset_size: 16
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
version-control/ds-lib-version-10k | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: versions
list:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
splits:
- name: train
num_bytes: 875267
num_examples: 9374
download_size: 325936
dataset_size: 875267
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bethgelab/simplegeometry | ---
license: cc-by-4.0
---
# SimpleGeometry
This repository contains simplified code for the paper:
**Wu's Method can Boost Symbolic AI to Rival Silver Medalists and AlphaGeometry to Outperform Gold Medalists at IMO Geometry, arXiv**</br>
Shiven Sinha\*, Ameya Prabhu\*, Ponnurangam Kumaraguru, Siddharth Bhat<sup>+</sup>, Matthias Bethge<sup>+</sup></br>
[[PDF](https://arxiv.org/abs/2404.06405)]
[[Slides]()]
[[Bibtex](https://huggingface.co/datasets/bethgelab/simplegeometry/#citation)]
To run any of the Wu's Method, Full-Angle, or Deductive Database provers implemented in JGEX:
- Download [JGEX](https://github.com/kovzol/Java-Geometry-Expert)
- Import any of the problem files. The diagram should be constructed automatically once this is done.
- Pick your prover from the top-right corner of the left pane and click the prove button.
In the case of 2002 P2A and P2B, you need to launch a separate JGEX instance and recreate the diagram from scratch using the provided file as a reference.
The proof process must be performed on this new instance.
We did not need to change the default memory limits to produce the results in the paper. However, you can additionally build JGEX from source to increase the memory limits for the provers. Follow the instructions in JGEX's documentation, and before running `./gradlew run`, (tweak and) paste the following line into the terminal to set the memory limit to 10 GB.
```bash
export _JAVA_OPTIONS="-Xmx10g"
```
#### If you discover any bugs in the code please contact me, I will cross-check them with my nightmares.
# Citation
We hope SimpleGeometry provides strong baselines, and Wu's method is useful for your cool automated theorem prover! To cite our work:
```
@inproceedings{sinha2024wu,
title={Wu's Method can Boost Symbolic AI to Rival Silver Medalists and AlphaGeometry to Outperform Gold Medalists at IMO Geometry},
author={Sinha, Shiven and Prabhu, Ameya and Kumaraguru, Ponnurangam and Bhat, Siddharth and Bethge, Matthias},
booktitle={arXiv preprint arXiv:2404.06405},
month={April},
year={2024}
}
```
|
CyberHarem/nidalee_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nidalee (League of Legends)
This is the dataset of nidalee (League of Legends), containing 42 images and their tags.
The core tags of this character are `long_hair, breasts, ponytail, green_eyes, large_breasts, brown_hair, dark_skin, facial_mark, dark-skinned_female, black_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 42 | 58.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nidalee_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 42 | 35.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nidalee_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 98 | 70.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nidalee_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 42 | 54.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nidalee_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 98 | 96.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nidalee_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nidalee_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, solo, cleavage, navel, midriff, necklace, spear, tribal, very_long_hair, bare_shoulders |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | navel | midriff | necklace | spear | tribal | very_long_hair | bare_shoulders |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------|:----------|:-----------|:--------|:---------|:-----------------|:-----------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
|
Logic123456789/luotuoQA-B | ---
extra_gated_prompt: 我们制作了luotuo-QA-B数据集,请仔细阅读Licensing Information部分的信息。
extra_gated_heading: "您需要接受协议并提交信息以获取此数据集"
extra_gated_fields:
姓名: text
邮箱: text
所在组织: text
使用目的: text
我同意仅将此数据集用于非商业用途: checkbox
extra_gated_button_content: "我已阅读协议并同意提供相关信息"
license: other
task_categories:
- question-answering
language:
- zh
- en
---
# Dataset Card for luotuo-QA-B
## Dataset Description
- **Homepage:** https://github.com/LC1332/Luotuo-Chinese-LLM
- **Repository:** https://github.com/LC1332/Luotuo-QA
- **Point of Contact:** qinyu_luo@163.com
### Dataset Summary
Anki_Card是一种用于记忆和学习的电子卡片系统。我们建立了一个类似于这种形式的问答数据集,旨在推动中英文语境下问答模型的研究和发展。
我们的数据集是在3个开源数据集之上生成构建的,这3个数据集分别是:
·Chinese Scientific Literature Dataset
·CNN-DailyMail News Text Summarization
·arXiv Dataset
您可以直接搜索这些原始数据集的名称或是从以下链接访问它们
·https://github.com/ydli-ai/CSL
·https://www.kaggle.com/datasets/gowrishankarp/newspaper-text-summarization-cnn-dailymail
·https://www.kaggle.com/datasets/Cornell-University/arxiv
我们在这些数据集的基础上针对每一个摘要或新闻生成了5个“问题-答案”对。数据分布如下:
---从Chinese Scientific Literature Dataset(CSL)数据集中生成了25836条中文数据,共129180个问答对。
---从CNN-DailyMail News Text Summarization数据集中生成了2026条数据,共10130个问答对。
---从arXiv Dataset数据集中生成了3602条英文数据,共18010个问答对。
此外,由于此数据集是我们Luotuo-QA项目的一部分,我们将它叫做luotuo-QA-B。
您可以在这里查看Luotuo-QA项目:https://github.com/LC1332/Luotuo-QA
此数据集适用于训练和评估中文对话式问答模型。有益于推动中文自然语言处理领域的发展,同时也为研究人员和开发者提供了一个基准,用于比较不同模型的性能和探索新的方法。
我们希望这一工作能够促进全球范围内中文语境对话式问答任务的研究和进一步的创新。
-----------------------------------------------------------------------------------------------------------------------------------------------
Anki_Card is an electronic flashcard system used for memory and learning. We have created a question-and-answer dataset in a similar format to facilitate research and development of question-answering models in both Chinese and English contexts.
Our dataset is constructed based on three open-source datasets:
·Chinese Scientific Literature Dataset
·CNN-DailyMail News Text Summarization
·arXiv Dataset
You can directly search for the names of these original datasets or access them from the following links:
·Chinese Scientific Literature Dataset (CSL): https://github.com/ydli-ai/CSL
·CNN-DailyMail News Text Summarization: https://www.kaggle.com/datasets/gowrishankarp/newspaper-text-summarization-cnn-dailymail
·arXiv Dataset: https://www.kaggle.com/datasets/Cornell-University/arxiv
Based on these datasets, we have generated five "question-answer" pairs for each summary or news article. The data distribution is as follows:
---From the Chinese Scientific Literature Dataset (CSL), we generated 25,836 Chinese data points, resulting in a total of 129,180 question-answer pairs.
---From the CNN-DailyMail News Text Summarization dataset, we generated 2,026 data points, resulting in a total of 10,130 question-answer pairs.
---From the arXiv Dataset, we generated 3,602 English data points, resulting in a total of 18,010 question-answer pairs.
Furthermore, as this dataset is part of our Luotuo-QA project, we refer to it as luotuo-QA-B.
You can find the Luotuo-QA project here: https://github.com/LC1332/Luotuo-QA
This dataset is suitable for training and evaluating Chinese conversational question-answering models. It contributes to the development of Chinese natural language processing and provides researchers and developers with a benchmark for comparing the performance of different models and exploring new approaches.
We hope that this work will promote research and further innovation in Chinese conversational question-answering tasks on a global scale.
### Languages
Chinese, English
### Data Instances
Chinese data example:
```
{
"story": "针对已有神经网络功放建模的建模精度不高,易陷入局部极值等问题,提出一种新的改进并行粒子群算法(Improved Parallel Particle Swarm Optimization,IPPSO)。该算法在并行粒子群算法的基础上引入自适应变异操作,防止陷入局部最优;在微粒的速度项中加入整体微粒群的全局最优位置,动态调节学习因子与线性递减惯性权重,加快微粒收敛。将该改进算法用于优化RBF神经网络参数,并用优化的网络对非线性功放进行建模仿真。结果表明,该算法能有效减小建模误差,且均方根误差提高19.08%,进一步提高了神经网络功放建模精度。",
"QA": [
{
"question": "什么是IPPSO算法?",
"answer": "IPPSO是一种改进的并行粒子群算法,引入自适应变异操作,防止陷入局部最优;在微粒的速度项中加入整体微粒群的全局最优位置,动态调节学习因子与线性递减惯性权重,加快微粒收敛。"
},
{
"question": "IPPSO算法用于什么?",
"answer": "IPPSO算法用于优化RBF神经网络参数,并用优化的网络对非线性功放进行建模仿真。"
},
{
"question": "IPPSO算法的优点是什么?",
"answer": "IPPSO算法能有效减小建模误差,且均方根误差提高19.08%,进一步提高了神经网络功放建模精度。"
},
{
"question": "为什么需要改进并行粒子群算法?",
"answer": "因为已有的神经网络功放建模的建模精度不高,易陷入局部极值等问题。"
},
{
"question": "IPPSO算法如何防止陷入局部最优?",
"answer": "IPPSO算法引入自适应变异操作,防止陷入局部最优;在微粒的速度项中加入整体微粒群的全局最优位置,动态调节学习因子与线性递减惯性权重,加快微粒收敛。"
}
]
}
```
English data example:
```
{
"story": "We discuss an alternative non-perturbative proof of Bertrand's theorem that leads in a concise way directly to the two allowed fields: the newtonian and the isotropic harmonic oscillator central fields.",
"QA": [
{
"question": "What is Bertrand's theorem?",
"answer": "It is a theorem that leads to the two allowed fields: the newtonian and the isotropic harmonic oscillator central fields."
},
{
"question": "What is the proof of Bertrand's theorem like?",
"answer": "It is an alternative non-perturbative proof."
},
{
"question": "How many central fields are allowed by Bertrand's theorem?",
"answer": "Two central fields are allowed: the newtonian and the isotropic harmonic oscillator."
},
{
"question": "What is the advantage of the proof discussed in the text?",
"answer": "It leads directly to the allowed fields in a concise way."
},
{
"question": "What are the two allowed fields according to Bertrand's theorem?",
"answer": "The newtonian and the isotropic harmonic oscillator central fields."
}
]
}
```
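Each record pairs one `story` with a list of QA items. Below is a minimal sketch of flattening such a record into (story, question, answer) triples for training; it uses plain Python on an inline sample, and nothing in it is mandated by the dataset itself:

```python
import json

# One record in the luotuo-QA-B format: a "story" plus a list of QA pairs.
record = json.loads("""
{
  "story": "We discuss an alternative non-perturbative proof of Bertrand's theorem.",
  "QA": [
    {"question": "What is Bertrand's theorem?",
     "answer": "A theorem that leads to two allowed central fields."},
    {"question": "How many central fields are allowed?",
     "answer": "Two."}
  ]
}
""")

# Flatten the record into (story, question, answer) triples for training.
triples = [(record["story"], qa["question"], qa["answer"]) for qa in record["QA"]]
for _, question, answer in triples:
    print(question, "->", answer)
```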
### Licensing Information
Our agreements are consistent with the agreements of three original datasets. Please read the following information.
· The protocol for the CSL dataset is Apache License 2.0. You are not allowed to use this file unless you comply with the license.
· The protocol for the CNN-DailyMail News Text Summarization dataset is CC0: Public Domain.
· The protocol for the arXiv dataset is CC0: Public Domain.
### Citation Information
Please cite us if you use the data or code in this repo.
```bibtex
@misc{luotuo-qa,
  author={Jianshen Liao and Ao Sun and Qinyu Luo and Hongsen Huang and Cheng Li},
title = {Luotuo-QA: Better Conversational Question Answering Model with Answer Completion},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/LC1332/Luotuo-QA}},
}
```
|
llizhx/sampled_coco2014_val | ---
dataset_info:
features:
- name: question_id
dtype: string
- name: image
dtype: image
- name: question
dtype: string
- name: answer
sequence: string
- name: id
dtype: int64
- name: license
dtype: int8
- name: file_name
dtype: string
- name: coco_url
dtype: string
- name: height
dtype: int32
- name: width
dtype: int32
- name: date_captured
dtype: string
splits:
- name: val
num_bytes: 164399316.95635
num_examples: 1000
download_size: 167343849
dataset_size: 164399316.95635
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
---
|
Nerfgun3/iskou_style | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/iskou_style/resolve/main/iskou_showcase.png"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Iskou Style Embedding / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/iskou_style/resolve/main/iskou_showcase.png"/>
## Usage
To use this embedding, you have to download the file as well as drop it into the "\stable-diffusion-webui\embeddings" folder
To use it in a prompt: ```"iskou_style"```
Personally, I would recommend using my embeddings with a strength of 0.8, like ```"(iskou_style:0.8)"```
I hope you enjoy the embedding. If you have any questions, you can ask me anything via Discord: "Nerfgun3#7508"
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights to the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
liuyanchen1015/MULTI_VALUE_qqp_relativizer_doubling | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 496425
num_examples: 2407
- name: test
num_bytes: 4904516
num_examples: 24140
- name: train
num_bytes: 4515480
num_examples: 21763
download_size: 6010527
dataset_size: 9916421
---
# Dataset Card for "MULTI_VALUE_qqp_relativizer_doubling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FanChen0116/bus_few4_128x_pvi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 502551
num_examples: 3628
- name: validation
num_bytes: 6900
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 0
dataset_size: 580069
---
# Dataset Card for "bus_few4_128x_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mila-intel/ProtST-BetaLactamase | ---
configs:
- config_name: default
data_files:
- split: train
path: beta_lactamase_train.csv
- split: validation
path: beta_lactamase_valid.csv
- split: test
path: beta_lactamase_test.csv
--- |
sudeepag/sampled-dialog_fsopt_data | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: _template_idx
dtype: int64
- name: _task_source
dtype: string
- name: _task_name
dtype: string
- name: _template_type
dtype: string
splits:
- name: train
num_bytes: 245288664.3438103
num_examples: 81376
download_size: 139440863
dataset_size: 245288664.3438103
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NLPC-UOM/nllb-top25k-enta-cleaned | ---
license: odc-by
task_categories:
- translation
language:
- en
- ta
size_categories:
- 10K<n<100K
---
### Licensing Information
The dataset is released under the terms of [ODC-BY](https://opendatacommons.org/licenses/by/1-0/). By using this dataset, you are also bound by the respective Terms of Use and License of the original source.
### Citation Information
Schwenk et al., CCMatrix: Mining Billions of High-Quality Parallel Sentences on the Web. ACL 2021. https://aclanthology.org/2021.acl-long.507/<br>
Heffernan et al., Bitext Mining Using Distilled Sentence Representations for Low-Resource Languages. arXiv https://arxiv.org/abs/2205.12654, 2022.<br>
NLLB Team et al., No Language Left Behind: Scaling Human-Centered Machine Translation. arXiv https://arxiv.org/abs/2207.04672, 2022.
### Contributions
We thank the NLLB Meta AI team for open sourcing the meta data and instructions on how to use it with special thanks to Bapi Akula, Pierre Andrews, Onur Çelebi, Sergey Edunov, Kenneth Heafield, Philipp Koehn, Alex Mourachko, Safiyyah Saleem, Holger Schwenk, and Guillaume Wenzek. We also thank the AllenNLP team at AI2 for hosting and releasing this data, including Akshita Bhagia (for engineering efforts to host the data, and create the huggingface dataset), and Jesse Dodge (for organizing the connection). |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/f936b644 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1339
dataset_size: 182
---
# Dataset Card for "f936b644"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DolphinNie/dungeon-dataset | ---
license: mit
tags:
- code
size_categories:
- 10K<n<100K
---
# Brogue Map Dataset
To clone this repo, use:
```
git clone https://huggingface.co/datasets/DolphinNie/dungeon-dataset
```
## 1. Data Explanation
This is the map dataset from the open-sourced game [Brogue](https://github.com/tmewett/BrogueCE). It contains 40,000 training maps, 10,000 test maps, and 10,000 validation maps.
Each map is stored in a `.csv` file as a `(32x32)` array.
Each cell in the array is an `int` ranging from 0 to 13, representing one of the following 14 tiles:
```json
"G_NONE": 0,
"G_GROUND": 1,
"G_SAND": 2,
"G_WATER": 3,
"G_BOG": 4,
"G_LAVA": 5,
"G_ICE": 6,
"G_GRASS": 7,
"G_FUNGUS": 8,
"G_ASHES": 9,
"G_STONE": 10,
"G_CRYSTAL": 11,
"G_FIRE": 12,
"G_BRIDGE": 13
```
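For convenience, the mapping above can be inverted to decode a CSV row of cell values back into tile names. A small sketch (the `decode_row` helper name is ours, not part of the dataset):

```python
# Inverse lookup for the tile listing above.
TILE_NAMES = {
    0: "G_NONE", 1: "G_GROUND", 2: "G_SAND", 3: "G_WATER", 4: "G_BOG",
    5: "G_LAVA", 6: "G_ICE", 7: "G_GRASS", 8: "G_FUNGUS", 9: "G_ASHES",
    10: "G_STONE", 11: "G_CRYSTAL", 12: "G_FIRE", 13: "G_BRIDGE",
}

def decode_row(csv_row):
    """Decode one comma-separated line of a map .csv into tile names."""
    return [TILE_NAMES[int(cell)] for cell in csv_row.split(",")]

print(decode_row("0,1,8,13"))  # ['G_NONE', 'G_GROUND', 'G_FUNGUS', 'G_BRIDGE']
```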
An example map datapoint looks like this:
```
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
0,0,0,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
0,0,1,1,1,8,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
0,1,1,1,8,8,8,0,0,0,0,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0
0,1,1,8,8,8,8,0,0,0,1,1,1,1,1,0,1,0,0,0,0,0,0,0,0,0,0,0
0,1,1,1,8,8,0,0,0,1,1,1,1,1,1,1,1,0,0,1,0,0,0,0,0,0,0,0
0,1,1,1,1,1,0,0,0,1,1,1,1,1,1,1,1,0,1,8,0,0,1,1,1,1,0,0
0,1,1,1,1,1,0,0,0,1,1,1,1,1,1,1,0,0,1,1,1,1,1,1,1,1,1,0
0,0,1,1,1,0,0,0,0,0,1,1,1,1,1,0,0,0,1,1,1,1,1,1,1,1,1,0
0,0,0,0,1,0,0,0,0,0,0,1,1,1,0,0,0,0,1,1,1,1,0,0,1,1,1,9
0,0,0,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,1,0,0,0,0,1,1,0
0,0,1,1,1,1,1,0,0,0,0,0,0,1,1,1,1,0,0,0,1,0,0,0,0,0,0,0
0,0,1,1,1,1,1,0,0,0,0,0,0,1,1,1,1,1,1,1,1,1,0,0,0,0,0,0
0,1,1,1,1,1,1,0,0,0,0,0,0,1,1,1,1,0,1,1,1,1,0,0,0,0,0,0
0,1,8,1,1,1,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0
0,1,8,8,8,8,1,8,0,0,0,0,1,8,1,1,0,0,0,0,0,0,0,0,0,1,1,1
0,0,8,8,8,8,8,8,0,0,0,8,8,8,8,8,1,0,0,0,1,1,0,0,0,1,1,1
0,0,1,8,8,8,8,8,8,0,1,8,8,8,8,8,1,0,0,0,1,1,0,0,0,0,1,1
0,0,0,1,8,8,8,8,8,0,1,1,1,8,8,1,0,0,0,0,1,1,0,1,0,1,1,1
0,0,0,8,8,8,8,8,8,1,1,1,1,8,1,1,0,0,0,0,1,1,1,1,0,1,1,0
0,0,0,8,8,8,8,1,0,0,0,3,1,0,1,0,0,0,0,0,0,1,1,1,0,1,1,0
0,0,0,0,8,8,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0
0,0,0,1,1,0,0,0,0,0,0,0,0,0,11,1,1,1,1,1,1,1,1,1,1,1,0,0
0,1,1,1,8,1,0,0,0,0,0,0,0,0,11,11,11,1,1,1,1,1,1,1,1,1,1,0
0,0,1,1,1,1,0,0,0,0,0,0,0,0,11,11,0,0,1,1,0,0,1,1,1,1,1,1
0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
```
## 2. Data processing
The Hugging Face hub stores each split as one flat table of map rows rather than as separate maps, so every 32 consecutive rows must be regrouped into a single map. To recover each map, use the following code:
```python
from datasets import load_dataset
import numpy as np
import matplotlib.pyplot as plt

# Load dataset from Hugging Face
dataset = load_dataset("DolphinNie/dungeon-dataset")

# Each split is stored as a flat table of map rows, so regroup every
# 32 consecutive rows into one map.
def dataset_convert(dataset):
    dataset_train = list()
    dataset_test = list()
    dataset_valid = list()
    datasets = [dataset_train, dataset_test, dataset_valid]
    name = ['train', 'test', 'validation']
    for i in range(3):
        datapoint_num = int(dataset[name[i]].num_rows / 32)
        dataset_tf = dataset[name[i]].to_pandas()
        for n in range(datapoint_num):
            env_num = dataset_tf[n * 32:(n + 1) * 32]
            datasets[i].append(env_num)
    return dataset_train, dataset_test, dataset_valid

dataset_train, dataset_test, dataset_valid = dataset_convert(dataset)

# Visualize the datapoints if you want
def visualize_map(dungeon_map):
    plt.imshow(dungeon_map, cmap='viridis', interpolation='nearest')
    plt.title('dungeon map')
    plt.show()

visualize_map(dataset_train[10000])
```
<img src="./README.assets/image-20240411203604268.png" alt="image-20240411203604268" style="zoom:50%;" />
Note that this dataset stores a two-dimensional integer representation of each map, not a three-dimensional one-hot representation. If you need one-hot input to train a new model, you need to further process the dataset.
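As a sketch of that further processing, the two-dimensional integer maps can be expanded into a one-hot `(height, width, 14)` array with NumPy (the helper below is an illustration, not part of the dataset):

```python
import numpy as np

NUM_TILES = 14  # tile ids range from 0 (G_NONE) to 13 (G_BRIDGE)

def to_one_hot(dungeon_map):
    """Convert an (H, W) integer map into an (H, W, NUM_TILES) one-hot array."""
    arr = np.asarray(dungeon_map, dtype=np.int64)
    # Index the identity matrix by tile id to get one-hot rows per cell.
    return np.eye(NUM_TILES, dtype=np.float32)[arr]

demo = [[0, 1], [8, 13]]
encoded = to_one_hot(demo)
print(encoded.shape)  # (2, 2, 14)
```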
|
irds/beir_nq | ---
pretty_name: '`beir/nq`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `beir/nq`
The `beir/nq` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/nq).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=2,681,468
- `queries` (i.e., topics); count=3,452
- `qrels`: (relevance assessments); count=4,201
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/beir_nq', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ..., 'title': ...}
queries = load_dataset('irds/beir_nq', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/beir_nq', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
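Once loaded, the qrels records can be grouped into the query-to-relevant-docs mapping that most retrieval evaluation tools expect. A sketch on toy rows in the same schema (real records come from the `qrels` config above):

```python
# Toy qrels rows mirroring the {'query_id', 'doc_id', 'relevance', 'iteration'}
# schema shown above; the ids here are made up for illustration.
qrels_records = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d3", "relevance": 1, "iteration": "0"},
]

# Group relevant doc_ids by query_id.
qrels = {}
for rec in qrels_records:
    if rec["relevance"] > 0:
        qrels.setdefault(rec["query_id"], set()).add(rec["doc_id"])

print(qrels)  # {'q1': {'d1'}, 'q2': {'d3'}}
```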
## Citation Information
```
@article{Kwiatkowski2019Nq,
title = {Natural Questions: a Benchmark for Question Answering Research},
author = {Tom Kwiatkowski and Jennimaria Palomaki and Olivia Redfield and Michael Collins and Ankur Parikh and Chris Alberti and Danielle Epstein and Illia Polosukhin and Matthew Kelcey and Jacob Devlin and Kenton Lee and Kristina N. Toutanova and Llion Jones and Ming-Wei Chang and Andrew Dai and Jakob Uszkoreit and Quoc Le and Slav Petrov},
year = {2019},
journal = {TACL}
}
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
sudeepag/sampled-flan_zsnoopt_data | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: _template_idx
dtype: int64
- name: _task_source
dtype: string
- name: _task_name
dtype: string
- name: _template_type
dtype: string
splits:
- name: train
num_bytes: 1833812046.5052116
num_examples: 3738460
download_size: 1125318193
dataset_size: 1833812046.5052116
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ihaflix1/cascao | ---
license: openrail
---
|
SyedAunZaidi/cv-corpus-10.0-ur | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: client_id
dtype: string
- name: path
dtype: string
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accents
dtype: string
- name: locale
dtype: string
- name: segment
dtype: float64
- name: config
dtype: string
splits:
- name: train
num_bytes: 108693654.24
num_examples: 4128
- name: test
num_bytes: 80422294.984
num_examples: 3267
- name: validation
num_bytes: 80422294.984
num_examples: 3267
download_size: 270016912
dataset_size: 269538244.20799994
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
ucalyptus/shrutilipi_bengali | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcriptions
dtype: string
splits:
- name: train
num_bytes: 78086461594.866
num_examples: 378691
download_size: 74356189780
dataset_size: 78086461594.866
---
# Dataset Card for "shrutilipi_bengali"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
olanigan/lp_transliterate | ---
license: apache-2.0
---
|
mattismegevand/IMSDb | ---
license: mit
task_categories:
- text-classification
- text-generation
- question-answering
- summarization
language:
- en
size_categories:
- 1K<n<10K
viewer: false
---
## IMSDb Scraper
A Python script that scrapes movie script details from the Internet Movie Script Database (IMSDb) website.
### Features:
- Fetches all script links available on IMSDb.
- Retrieves details for each movie script including:
- Title
- Poster Image URL
- IMSDb Opinion
- IMSDb Rating
- Average User Rating
- Writers
- Genres
- Script Date
- Movie Release Date
- Submitted By
- Full Script Text
### Installation
1. Clone the repository.
2. Install the required Python packages.
```bash
pip install -r requirements.txt
```
### Usage
1. Simply run the script.
```bash
python scraper.py
```
2. After the script runs, the extracted movie script details will be saved to `data.jsonl`.
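A hypothetical sketch of consuming `data.jsonl` afterwards; the field names used below (`title`, `genres`, `script`) are illustrative assumptions, so check a line of your own output for the actual keys:

```python
import json

# Illustrative record; these keys are assumptions, not a guaranteed schema.
sample = {"title": "Example Movie", "genres": ["Drama"], "script": "FADE IN: ..."}

# The scraper appends one JSON object per line to data.jsonl.
with open("data.jsonl", "w", encoding="utf-8") as f:
    f.write(json.dumps(sample) + "\n")

# Reading it back is one json.loads per line.
with open("data.jsonl", encoding="utf-8") as f:
    records = [json.loads(line) for line in f]

titles = [r["title"] for r in records]
print(titles)  # ['Example Movie']
```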
### Error Handling
If any errors occur during scraping, they will be written to `error.txt` with the respective URL and error message.
### Contributing
Feel free to open issues or PRs if you find any problems or have improvements in mind.
### License
This project is licensed under the MIT License. See `LICENSE` for more details. |
AdapterOcean/python3-standardized_cluster_13 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 64349243
num_examples: 5559
download_size: 0
dataset_size: 64349243
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_13"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rajpurkar/squad_v2 | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- open-domain-qa
- extractive-qa
paperswithcode_id: squad
pretty_name: SQuAD2.0
dataset_info:
config_name: squad_v2
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 116732025
num_examples: 130319
- name: validation
num_bytes: 11661091
num_examples: 11873
download_size: 17720493
dataset_size: 128393116
configs:
- config_name: squad_v2
data_files:
- split: train
path: squad_v2/train-*
- split: validation
path: squad_v2/validation-*
default: true
train-eval-index:
- config: squad_v2
task: question-answering
task_id: extractive_question_answering
splits:
train_split: train
eval_split: validation
col_mapping:
question: question
context: context
answers:
text: text
answer_start: answer_start
metrics:
- type: squad_v2
name: SQuAD v2
---
# Dataset Card for SQuAD 2.0
## Table of Contents
- [Dataset Card for "squad_v2"](#dataset-card-for-squad_v2)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [squad_v2](#squad_v2)
- [Data Fields](#data-fields)
- [squad_v2](#squad_v2-1)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://rajpurkar.github.io/SQuAD-explorer/
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** https://arxiv.org/abs/1806.03822
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
Stanford Question Answering Dataset (SQuAD) is a reading comprehension dataset, consisting of questions posed by crowdworkers on a set of Wikipedia articles, where the answer to every question is a segment of text, or span, from the corresponding reading passage, or the question might be unanswerable.
SQuAD 2.0 combines the 100,000 questions in SQuAD1.1 with over 50,000 unanswerable questions written adversarially by crowdworkers
to look similar to answerable ones. To do well on SQuAD2.0, systems must not only answer questions when possible, but
also determine when no answer is supported by the paragraph and abstain from answering.
### Supported Tasks and Leaderboards
Question Answering.
### Languages
English (`en`).
## Dataset Structure
### Data Instances
#### squad_v2
- **Size of downloaded dataset files:** 46.49 MB
- **Size of the generated dataset:** 128.52 MB
- **Total amount of disk used:** 175.02 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [94, 87, 94, 94],
"text": ["10th and 11th centuries", "in the 10th and 11th centuries", "10th and 11th centuries", "10th and 11th centuries"]
},
"context": "\"The Normans (Norman: Nourmands; French: Normands; Latin: Normanni) were the people who in the 10th and 11th centuries gave thei...",
"id": "56ddde6b9a695914005b9629",
"question": "When were the Normans in Normandy?",
"title": "Normans"
}
```
### Data Fields
The data fields are the same among all splits.
#### squad_v2
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
### Data Splits
| name | train | validation |
| -------- | -----: | ---------: |
| squad_v2 | 130319 | 11873 |
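Since unanswerable questions are encoded with an empty `answers.text` list, they can be separated with a one-line check. A minimal sketch on inline examples in the card's schema (the second example is invented for illustration):

```python
def is_unanswerable(example):
    """SQuAD 2.0 marks unanswerable questions with an empty answers list."""
    return len(example["answers"]["text"]) == 0

answerable = {
    "question": "When were the Normans in Normandy?",
    "answers": {"text": ["10th and 11th centuries"], "answer_start": [94]},
}
impossible = {  # invented example for illustration
    "question": "Who was the king of Atlantis?",
    "answers": {"text": [], "answer_start": []},
}

print(is_unanswerable(answerable), is_unanswerable(impossible))  # False True
```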
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
The dataset is distributed under the CC BY-SA 4.0 license.
### Citation Information
```
@inproceedings{rajpurkar-etal-2018-know,
title = "Know What You Don{'}t Know: Unanswerable Questions for {SQ}u{AD}",
author = "Rajpurkar, Pranav and
Jia, Robin and
Liang, Percy",
editor = "Gurevych, Iryna and
Miyao, Yusuke",
booktitle = "Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)",
month = jul,
year = "2018",
address = "Melbourne, Australia",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P18-2124",
doi = "10.18653/v1/P18-2124",
pages = "784--789",
eprint={1806.03822},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@inproceedings{rajpurkar-etal-2016-squad,
title = "{SQ}u{AD}: 100,000+ Questions for Machine Comprehension of Text",
author = "Rajpurkar, Pranav and
Zhang, Jian and
Lopyrev, Konstantin and
Liang, Percy",
editor = "Su, Jian and
Duh, Kevin and
Carreras, Xavier",
booktitle = "Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2016",
address = "Austin, Texas",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/D16-1264",
doi = "10.18653/v1/D16-1264",
pages = "2383--2392",
eprint={1606.05250},
archivePrefix={arXiv},
primaryClass={cs.CL},
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@albertvillanova](https://github.com/albertvillanova), [@patrickvonplaten](https://github.com/patrickvonplaten), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
claytonsamples/movie_chat | ---
license: mit
---
|
ysr/rust-sft-training | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 14443200.771399386
num_examples: 13216
- name: test
num_bytes: 1605407.2286006128
num_examples: 1469
download_size: 6837641
dataset_size: 16048608.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Pelosl/Stuckinswing | ---
license: other
task_categories:
- text-classification
language:
- en
pretty_name: Stuck
size_categories:
- 1K<n<10K
--- |
michaelmallari/airbnb-ca-bc-vancouver | ---
license: mit
---
|
autoevaluate/autoeval-staging-eval-project-fe056b5c-7e36-4094-b3f2-84d1fbaaf77c-53 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: autoevaluate/multi-class-classification
metrics: ['matthews_correlation']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: autoevaluate/multi-class-classification
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
bertbsb/HerbeetHenrique | ---
license: openrail
---
|
mteb/biosses-sts | ---
language:
- en
--- |
kheopss/prompt_dataset_p27_reformulated_2 | ---
dataset_info:
features:
- name: response
dtype: string
- name: rewriten
dtype: string
splits:
- name: train
num_bytes: 209183
num_examples: 100
download_size: 88418
dataset_size: 209183
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-50000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1097811
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jinya425/Keyphrase_extraction | ---
license: mit
---
|
qwopqwop/danbooru2022-tags.csv | ---
license: mit
---
https://huggingface.co/datasets/qwopqwop/danbooru2022_tags |
Pisethan/avirut | ---
license: apache-2.0
---
|
HuggingFaceH4/instruction-pilot-outputs-greedy | ---
dataset_info:
features:
- name: id
dtype: int64
- name: source
dtype: string
- name: prompt
dtype: string
- name: outputs
list:
- name: model
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 243208
num_examples: 375
download_size: 100726
dataset_size: 243208
---
# Dataset Card for "instruction-pilot-outputs-greedy"
This dataset contains model outputs generated from the human demonstrations provided in [`HuggingFaceH4/instruction-pilot-prompts`](https://huggingface.co/datasets/HuggingFaceH4/instruction-pilot-prompts).
To convert each language model into a dialogue agent, we prepended the following [LangChain prompt](https://github.com/hwchase17/langchain/blob/bfabd1d5c0bf536fdd1e743e4db8341e7dfe82a9/langchain/chains/conversation/prompt.py#LL4C21-L9C7) to each input:
```
The following is a friendly conversation between a human and an AI. \
The AI is talkative and provides lots of specific details from its context. \
If the AI does not know the answer to a question, it truthfully says it does not know.
Human: {input}
AI:
```
For reproducibility purposes, we used deterministic text generation (`temperature=0`) and set `max_new_tokens=100` (which is about the mean lenght of the Self-Instruct outputs). |
mteb/AILA_statutes | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- https://zenodo.org/records/4063986
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_examples: 217
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_examples: 82
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_examples: 50
configs:
- config_name: default
data_files:
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
---
**AILA_statutes**
- Original link: https://zenodo.org/records/4063986
- This dataset is structured for the task of identifying the most relevant statutes for a given situation.
- The query set comprises 50 queries, each describing a specific situation.
- The corpus set consists of the titles and descriptions of the statutes.
**Usage**
```
import datasets
# Download the dataset
queries = datasets.load_dataset("mteb/AILA_statutes", "queries")
documents = datasets.load_dataset("mteb/AILA_statutes", "corpus")
pair_labels = datasets.load_dataset("mteb/AILA_statutes", "default")
``` |
P22/beta-flower | ---
license: afl-3.0
---
|
open-llm-leaderboard/details_Walmart-the-bag__WordWoven-13B | ---
pretty_name: Evaluation run of Walmart-the-bag/WordWoven-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Walmart-the-bag/WordWoven-13B](https://huggingface.co/Walmart-the-bag/WordWoven-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Walmart-the-bag__WordWoven-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T14:04:01.998645](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__WordWoven-13B/blob/main/results_2024-01-04T14-04-01.998645.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6426229908497888,\n\
\ \"acc_stderr\": 0.032194876651038366,\n \"acc_norm\": 0.6446818377157361,\n\
\ \"acc_norm_stderr\": 0.032837192713734226,\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5445170161036526,\n\
\ \"mc2_stderr\": 0.015492435025282279\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.014163366896192603,\n\
\ \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.01383056892797433\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6697868950408286,\n\
\ \"acc_stderr\": 0.004693285694663838,\n \"acc_norm\": 0.8580959968133838,\n\
\ \"acc_norm_stderr\": 0.003482384956632779\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902796,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n\
\ \"acc_stderr\": 0.016018239710513405,\n \"acc_norm\": 0.3564245810055866,\n\
\ \"acc_norm_stderr\": 0.016018239710513405\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.01274823839736555,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.01274823839736555\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784586,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784586\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5445170161036526,\n\
\ \"mc2_stderr\": 0.015492435025282279\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710686\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.601213040181956,\n \
\ \"acc_stderr\": 0.013487360477060832\n }\n}\n```"
repo_url: https://huggingface.co/Walmart-the-bag/WordWoven-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-04-01.998645.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-04-01.998645.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- '**/details_harness|winogrande|5_2024-01-04T14-04-01.998645.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T14-04-01.998645.parquet'
- config_name: results
data_files:
- split: 2024_01_04T14_04_01.998645
path:
- results_2024-01-04T14-04-01.998645.parquet
- split: latest
path:
- results_2024-01-04T14-04-01.998645.parquet
---
# Dataset Card for Evaluation run of Walmart-the-bag/WordWoven-13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Walmart-the-bag/WordWoven-13B](https://huggingface.co/Walmart-the-bag/WordWoven-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Walmart-the-bag__WordWoven-13B",
"harness_winogrande_5",
split="train")
```
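As a side note on the split naming described above, the per-run split name appears to be the run timestamp with `:` and `-` replaced by underscores. A minimal sketch of that mapping (the helper name is illustrative, not part of the evaluation harness):

```python
def split_name_from_timestamp(ts: str) -> str:
    """Derive a per-run split name from a run timestamp.

    E.g. "2024-01-04T14:04:01.998645" -> "2024_01_04T14_04_01.998645",
    matching the split names listed in the YAML metadata above.
    """
    return ts.replace(":", "_").replace("-", "_")


print(split_name_from_timestamp("2024-01-04T14:04:01.998645"))
```

This lets you construct the split argument for `load_dataset` from a known run timestamp instead of hard-coding it.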
## Latest results
These are the [latest results from run 2024-01-04T14:04:01.998645](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__WordWoven-13B/blob/main/results_2024-01-04T14-04-01.998645.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6426229908497888,
"acc_stderr": 0.032194876651038366,
"acc_norm": 0.6446818377157361,
"acc_norm_stderr": 0.032837192713734226,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5445170161036526,
"mc2_stderr": 0.015492435025282279
},
"harness|arc:challenge|25": {
"acc": 0.6228668941979523,
"acc_stderr": 0.014163366896192603,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.6697868950408286,
"acc_stderr": 0.004693285694663838,
"acc_norm": 0.8580959968133838,
"acc_norm_stderr": 0.003482384956632779
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902796,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.016018239710513405,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.016018239710513405
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.01274823839736555,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.01274823839736555
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784586,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784586
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5445170161036526,
"mc2_stderr": 0.015492435025282279
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710686
},
"harness|gsm8k|5": {
"acc": 0.601213040181956,
"acc_stderr": 0.013487360477060832
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
rondoniareal/vozcomercialgrave | ---
license: openrail
---
|
large_spanish_corpus | ---
annotations_creators:
- no-annotation
language_creators:
- expert-generated
language:
- es
license:
- mit
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
- 100M<n<1B
- 10K<n<100K
- 10M<n<100M
- 1M<n<10M
source_datasets:
- original
task_categories:
- other
task_ids: []
paperswithcode_id: null
pretty_name: The Large Spanish Corpus
tags: []
dataset_info:
- config_name: JRC
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 380895504
num_examples: 3410620
download_size: 4099166669
dataset_size: 380895504
- config_name: EMEA
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 100259598
num_examples: 1221233
download_size: 4099166669
dataset_size: 100259598
- config_name: GlobalVoices
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 114435784
num_examples: 897075
download_size: 4099166669
dataset_size: 114435784
- config_name: ECB
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 336285757
num_examples: 1875738
download_size: 4099166669
dataset_size: 336285757
- config_name: DOGC
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 898279656
num_examples: 10917053
download_size: 4099166669
dataset_size: 898279656
- config_name: all_wikis
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3782280549
num_examples: 28109484
download_size: 4099166669
dataset_size: 3782280549
- config_name: TED
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15858148
num_examples: 157910
download_size: 4099166669
dataset_size: 15858148
- config_name: multiUN
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2327269369
num_examples: 13127490
download_size: 4099166669
dataset_size: 2327269369
- config_name: Europarl
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 359897865
num_examples: 2174141
download_size: 4099166669
dataset_size: 359897865
- config_name: NewsCommentary11
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 48350573
num_examples: 288771
download_size: 4099166669
dataset_size: 48350573
- config_name: UN
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 23654590
num_examples: 74067
download_size: 4099166669
dataset_size: 23654590
- config_name: EUBookShop
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1326861077
num_examples: 8214959
download_size: 4099166669
dataset_size: 1326861077
- config_name: ParaCrawl
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1840430234
num_examples: 15510649
download_size: 4099166669
dataset_size: 1840430234
- config_name: OpenSubtitles2018
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7477281776
num_examples: 213508602
download_size: 4099166669
dataset_size: 7477281776
- config_name: DGT
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 396217351
num_examples: 3168368
download_size: 4099166669
dataset_size: 396217351
- config_name: combined
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 19428257807
num_examples: 302656160
download_size: 4099166669
dataset_size: 19428257807
config_names:
- DGT
- DOGC
- ECB
- EMEA
- EUBookShop
- Europarl
- GlobalVoices
- JRC
- NewsCommentary11
- OpenSubtitles2018
- ParaCrawl
- TED
- UN
- all_wikis
- combined
- multiUN
---
# Dataset Card for The Large Spanish Corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/josecannete/spanish-corpora](https://github.com/josecannete/spanish-corpora)
- **Repository:** [https://github.com/josecannete/spanish-corpora](https://github.com/josecannete/spanish-corpora)
- **Paper:**
- **Data:** https://doi.org/10.5281/zenodo.3247731
- **Leaderboard:**
- **Point of Contact:** [José Cañete](mailto:jose.canete@ug.uchile.cl) (corpus creator) or [Lewis Tunstall](mailto:lewis.c.tunstall@gmail.com) (corpus submitter)
### Dataset Summary
The Large Spanish Corpus is a compilation of 15 unlabelled Spanish corpora, ranging from Wikipedia to European parliament notes. Each config contains the data corresponding to a different corpus. For example, `all_wikis` only includes examples from the Spanish wikis:
```python
from datasets import load_dataset
all_wikis = load_dataset('large_spanish_corpus', name='all_wikis')
```
By default, the config is set to "combined", which loads all the corpora.
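Since the "combined" config spans all 15 corpora (roughly 19 GB of text), streaming can be preferable to a full download. The sketch below is illustrative, not part of the `datasets` API: the `CONFIGS` list is copied from this card's metadata, and `load_config` is a hypothetical helper assuming the Hub script still exposes these config names.

```python
# Illustrative helper: validate a config name from this card's metadata,
# then load it, optionally as a streaming (lazily iterated) dataset.
CONFIGS = [
    "DGT", "DOGC", "ECB", "EMEA", "EUBookShop", "Europarl",
    "GlobalVoices", "JRC", "NewsCommentary11", "OpenSubtitles2018",
    "ParaCrawl", "TED", "UN", "all_wikis", "multiUN", "combined",
]

def load_config(name, streaming=True):
    """Load one config of large_spanish_corpus.

    streaming=True iterates over the remote files lazily instead of
    downloading the full archive up front.
    """
    if name not in CONFIGS:
        raise ValueError(f"unknown config {name!r}; choose one of {CONFIGS}")
    # Imported lazily so the name check above works even without
    # the `datasets` library installed.
    from datasets import load_dataset
    return load_dataset("large_spanish_corpus", name=name,
                        split="train", streaming=streaming)
```

Each example in every config has a single `text` field, so iterating over `load_config("TED")` would yield dicts of the form `{"text": ...}`.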
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Spanish
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
The following is taken from the corpus's source repository:
* Spanish Wikis: Which include Wikipedia, Wikinews, Wikiquotes and more. These were first processed with wikiextractor (https://github.com/josecannete/wikiextractorforBERT) using the wikis dump of 20/04/2019.
* ParaCrawl: Spanish portion of ParaCrawl (http://opus.nlpl.eu/ParaCrawl.php)
* EUBookshop: Spanish portion of EUBookshop (http://opus.nlpl.eu/EUbookshop.php)
* MultiUN: Spanish portion of MultiUN (http://opus.nlpl.eu/MultiUN.php)
* OpenSubtitles: Spanish portion of OpenSubtitles2018 (http://opus.nlpl.eu/OpenSubtitles-v2018.php)
* DGT: Spanish portion of DGT (http://opus.nlpl.eu/DGT.php)
* DOGC: Spanish portion of DOGC (http://opus.nlpl.eu/DOGC.php)
* ECB: Spanish portion of ECB (http://opus.nlpl.eu/ECB.php)
* EMEA: Spanish portion of EMEA (http://opus.nlpl.eu/EMEA.php)
* Europarl: Spanish portion of Europarl (http://opus.nlpl.eu/Europarl.php)
* GlobalVoices: Spanish portion of GlobalVoices (http://opus.nlpl.eu/GlobalVoices.php)
* JRC: Spanish portion of JRC (http://opus.nlpl.eu/JRC-Acquis.php)
* News-Commentary11: Spanish portion of NCv11 (http://opus.nlpl.eu/News-Commentary-v11.php)
* TED: Spanish portion of TED (http://opus.nlpl.eu/TED2013.php)
* UN: Spanish portion of UN (http://opus.nlpl.eu/UN.php)
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@lewtun](https://github.com/lewtun) for adding this dataset. |
alvations/c4p0-v2-ja-en | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: target_backto_source
dtype: string
- name: raw_target
list:
- name: generated_text
dtype: string
- name: raw_target_backto_source
list:
- name: generated_text
dtype: string
- name: prompt
dtype: string
- name: reverse_prompt
dtype: string
- name: source_langid
dtype: string
- name: target_langid
dtype: string
- name: target_backto_source_langid
dtype: string
- name: doc_id
dtype: int64
- name: sent_id
dtype: int64
- name: timestamp
dtype: string
- name: url
dtype: string
- name: doc_hash
dtype: string
- name: dataset
dtype: string
- name: source_lang
dtype: string
- name: target_lang
dtype: string
splits:
- name: train
num_bytes: 24906424
num_examples: 20164
download_size: 10803890
dataset_size: 24906424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yiqing07/data | ---
license: apache-2.0
---
|