| datasetId | card |
|---|---|
wgarstka/test | ---
license: other
---
|
jahb57/bert_embeddings_BATCH_12 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: last_hidden_state
sequence:
sequence: float32
- name: pooler_output
sequence: float32
splits:
- name: train
num_bytes: 19700359751
num_examples: 100000
download_size: 19824797074
dataset_size: 19700359751
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
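The schema above declares per-sentence BERT outputs. As a minimal sketch, one row can be mirrored in plain Python as below; the values are invented and the 768-dimensional hidden size is an assumption (standard BERT-base), not something stated in the card:

```python
# Hypothetical row mirroring the dataset_info schema above
# (values invented; the real train split is ~19.7 GB with 100,000 examples).
num_tokens, hidden = 8, 768  # hidden size assumed (BERT-base); not stated in the card

row = {
    "sentence": "an example input sentence",  # dtype: string
    "last_hidden_state": [[0.0] * hidden for _ in range(num_tokens)],  # sequence of sequences of float32
    "pooler_output": [0.0] * hidden,  # sequence of float32
}

# Per the schema, every token vector has the same width as the pooled output.
assert all(len(tok) == len(row["pooler_output"]) for tok in row["last_hidden_state"])
print(len(row["last_hidden_state"]), len(row["pooler_output"]))
```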
|
elolelo/movie-corpus | ---
license: mit
---
|
autoevaluate/autoeval-staging-eval-project-ac4402f5-7985073 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- beans
eval_info:
task: image_multi_class_classification
model: karthiksv/vit-base-beans
metrics: []
dataset_name: beans
dataset_config: default
dataset_split: test
col_mapping:
image: image
target: labels
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Image Classification
* Model: karthiksv/vit-base-beans
* Dataset: beans
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
alvations/globalvoices-en-es | ---
dataset_info:
features:
- name: en
dtype: string
- name: es
dtype: string
splits:
- name: train
num_bytes: 89033765
num_examples: 355136
download_size: 57678468
dataset_size: 89033765
---
# Dataset Card for "globalvoices-en-es"
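Per the feature schema above, each example is an aligned English-Spanish sentence pair. A minimal sketch of one row (the sentences here are invented placeholders, not drawn from the corpus):

```python
# Hypothetical example pair for this English-Spanish parallel corpus
# (sentences invented; the real train split holds 355,136 aligned pairs).
row = {
    "en": "The world is talking.",    # dtype: string
    "es": "El mundo está hablando.",  # dtype: string
}

# One field per language, both plain strings.
assert set(row) == {"en", "es"}
assert all(isinstance(v, str) for v in row.values())
print(sorted(row))
```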
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_abacusai__MM-OV-bagel-DPO-34b-c1000-250 | ---
pretty_name: Evaluation run of abacusai/MM-OV-bagel-DPO-34b-c1000-250
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abacusai/MM-OV-bagel-DPO-34b-c1000-250](https://huggingface.co/abacusai/MM-OV-bagel-DPO-34b-c1000-250)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__MM-OV-bagel-DPO-34b-c1000-250\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-24T07:59:43.945933](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__MM-OV-bagel-DPO-34b-c1000-250/blob/main/results_2024-01-24T07-59-43.945933.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7597155366563035,\n\
\ \"acc_stderr\": 0.02837032363320797,\n \"acc_norm\": 0.7632345413090461,\n\
\ \"acc_norm_stderr\": 0.02891633054739416,\n \"mc1\": 0.4810281517747858,\n\
\ \"mc1_stderr\": 0.01749089640576235,\n \"mc2\": 0.6367417890283518,\n\
\ \"mc2_stderr\": 0.01475171297078638\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.01397545412275656,\n\
\ \"acc_norm\": 0.681740614334471,\n \"acc_norm_stderr\": 0.013611993916971451\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6408086038637721,\n\
\ \"acc_stderr\": 0.004787829168255652,\n \"acc_norm\": 0.8396733718382793,\n\
\ \"acc_norm_stderr\": 0.0036615885079775523\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n\
\ \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n\
\ \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.02564834125169361,\n\
\ \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.02564834125169361\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n\
\ \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n\
\ \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n\
\ \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n\
\ \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.7398843930635838,\n\
\ \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.02694748312149625,\n\
\ \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.02694748312149625\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\
\ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\
\ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.036001056927277696,\n\
\ \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.036001056927277696\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7407407407407407,\n \"acc_stderr\": 0.02256989707491842,\n \"\
acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02256989707491842\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9,\n \"acc_stderr\": 0.01706640371965727,\n \"acc_norm\": 0.9,\n\
\ \"acc_norm_stderr\": 0.01706640371965727\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6798029556650246,\n \"acc_stderr\": 0.03282649385304151,\n\
\ \"acc_norm\": 0.6798029556650246,\n \"acc_norm_stderr\": 0.03282649385304151\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781664,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781664\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9444444444444444,\n \"acc_stderr\": 0.016319950700767374,\n \"\
acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.016319950700767374\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295131,\n\
\ \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295131\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.0198801654065888,\n \
\ \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.0198801654065888\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.45555555555555555,\n \"acc_stderr\": 0.03036486250482443,\n \
\ \"acc_norm\": 0.45555555555555555,\n \"acc_norm_stderr\": 0.03036486250482443\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.865546218487395,\n \"acc_stderr\": 0.02215937307274444,\n \
\ \"acc_norm\": 0.865546218487395,\n \"acc_norm_stderr\": 0.02215937307274444\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116245,\n \"\
acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116245\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6481481481481481,\n \"acc_stderr\": 0.03256850570293647,\n \"\
acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.03256850570293647\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\"\
: 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"\
acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515375,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515375\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622804,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622804\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.030381596756651655,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.030381596756651655\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n\
\ \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253858,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253858\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9106002554278416,\n\
\ \"acc_stderr\": 0.010203017847688312,\n \"acc_norm\": 0.9106002554278416,\n\
\ \"acc_norm_stderr\": 0.010203017847688312\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.02090397584208303,\n\
\ \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.02090397584208303\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8100558659217877,\n\
\ \"acc_stderr\": 0.013119028310492683,\n \"acc_norm\": 0.8100558659217877,\n\
\ \"acc_norm_stderr\": 0.013119028310492683\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.020823758837580912,\n\
\ \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.020823758837580912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n\
\ \"acc_stderr\": 0.022827317491059686,\n \"acc_norm\": 0.797427652733119,\n\
\ \"acc_norm_stderr\": 0.022827317491059686\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.019061588181505388,\n\
\ \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.019061588181505388\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6347517730496454,\n \"acc_stderr\": 0.02872386385328127,\n \
\ \"acc_norm\": 0.6347517730496454,\n \"acc_norm_stderr\": 0.02872386385328127\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.590612777053455,\n\
\ \"acc_stderr\": 0.012558780895570755,\n \"acc_norm\": 0.590612777053455,\n\
\ \"acc_norm_stderr\": 0.012558780895570755\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559342,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559342\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \
\ \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n\
\ \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659407,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659407\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4810281517747858,\n\
\ \"mc1_stderr\": 0.01749089640576235,\n \"mc2\": 0.6367417890283518,\n\
\ \"mc2_stderr\": 0.01475171297078638\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \
\ \"acc_stderr\": 0.012333447581047539\n }\n}\n```"
repo_url: https://huggingface.co/abacusai/MM-OV-bagel-DPO-34b-c1000-250
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|arc:challenge|25_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|arc:challenge|25_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|gsm8k|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|gsm8k|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hellaswag|10_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hellaswag|10_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T07-56-05.449917.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T07-59-43.945933.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T07-59-43.945933.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- '**/details_harness|winogrande|5_2024-01-24T07-56-05.449917.parquet'
- split: 2024_01_24T07_59_43.945933
path:
- '**/details_harness|winogrande|5_2024-01-24T07-59-43.945933.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-24T07-59-43.945933.parquet'
- config_name: results
data_files:
- split: 2024_01_24T07_56_05.449917
path:
- results_2024-01-24T07-56-05.449917.parquet
- split: 2024_01_24T07_59_43.945933
path:
- results_2024-01-24T07-59-43.945933.parquet
- split: latest
path:
- results_2024-01-24T07-59-43.945933.parquet
---
# Dataset Card for Evaluation run of abacusai/MM-OV-bagel-DPO-34b-c1000-250
Dataset automatically created during the evaluation run of model [abacusai/MM-OV-bagel-DPO-34b-c1000-250](https://huggingface.co/abacusai/MM-OV-bagel-DPO-34b-c1000-250) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__MM-OV-bagel-DPO-34b-c1000-250",
"harness_winogrande_5",
split="train")
```
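The timestamp-based split naming shown above sorts chronologically as plain strings, so the newest run can be recovered without parsing dates. A minimal sketch, using the two split names from this card:

```python
# Splits are named after the run timestamp (e.g. 2024_01_24T07_59_43.945933);
# the "latest" split always mirrors the newest of these. Because the format is
# zero-padded year-to-microsecond, lexicographic order equals chronological order.
splits = ["2024_01_24T07_56_05.449917", "2024_01_24T07_59_43.945933"]

latest = max(splits)  # string max == most recent run
print(latest)  # -> 2024_01_24T07_59_43.945933
```

Passing this split name (or simply `"latest"`) as the `split` argument to `load_dataset` selects that run's results.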
## Latest results
These are the [latest results from run 2024-01-24T07:59:43.945933](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__MM-OV-bagel-DPO-34b-c1000-250/blob/main/results_2024-01-24T07-59-43.945933.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.7597155366563035,
"acc_stderr": 0.02837032363320797,
"acc_norm": 0.7632345413090461,
"acc_norm_stderr": 0.02891633054739416,
"mc1": 0.4810281517747858,
"mc1_stderr": 0.01749089640576235,
"mc2": 0.6367417890283518,
"mc2_stderr": 0.01475171297078638
},
"harness|arc:challenge|25": {
"acc": 0.6459044368600683,
"acc_stderr": 0.01397545412275656,
"acc_norm": 0.681740614334471,
"acc_norm_stderr": 0.013611993916971451
},
"harness|hellaswag|10": {
"acc": 0.6408086038637721,
"acc_stderr": 0.004787829168255652,
"acc_norm": 0.8396733718382793,
"acc_norm_stderr": 0.0036615885079775523
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.02564834125169361,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.02564834125169361
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.02694748312149625,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.02694748312149625
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.036001056927277696,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.036001056927277696
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02256989707491842,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02256989707491842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.01706640371965727,
"acc_norm": 0.9,
"acc_norm_stderr": 0.01706640371965727
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6798029556650246,
"acc_stderr": 0.03282649385304151,
"acc_norm": 0.6798029556650246,
"acc_norm_stderr": 0.03282649385304151
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781664,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781664
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.016319950700767374,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.016319950700767374
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295131,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295131
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.0198801654065888,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.0198801654065888
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45555555555555555,
"acc_stderr": 0.03036486250482443,
"acc_norm": 0.45555555555555555,
"acc_norm_stderr": 0.03036486250482443
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.865546218487395,
"acc_stderr": 0.02215937307274444,
"acc_norm": 0.865546218487395,
"acc_norm_stderr": 0.02215937307274444
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116245,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.03256850570293647,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.03256850570293647
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969427,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407252,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407252
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515375,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622804,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622804
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.030381596756651655,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.030381596756651655
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.0339329572976101,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.0339329572976101
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253858,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253858
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9106002554278416,
"acc_stderr": 0.010203017847688312,
"acc_norm": 0.9106002554278416,
"acc_norm_stderr": 0.010203017847688312
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.815028901734104,
"acc_stderr": 0.02090397584208303,
"acc_norm": 0.815028901734104,
"acc_norm_stderr": 0.02090397584208303
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8100558659217877,
"acc_stderr": 0.013119028310492683,
"acc_norm": 0.8100558659217877,
"acc_norm_stderr": 0.013119028310492683
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.020823758837580912,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.020823758837580912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.797427652733119,
"acc_stderr": 0.022827317491059686,
"acc_norm": 0.797427652733119,
"acc_norm_stderr": 0.022827317491059686
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.019061588181505388,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.019061588181505388
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6347517730496454,
"acc_stderr": 0.02872386385328127,
"acc_norm": 0.6347517730496454,
"acc_norm_stderr": 0.02872386385328127
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.590612777053455,
"acc_stderr": 0.012558780895570755,
"acc_norm": 0.590612777053455,
"acc_norm_stderr": 0.012558780895570755
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559342,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559342
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.01569702924075778,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.01569702924075778
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659407,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659407
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4810281517747858,
"mc1_stderr": 0.01749089640576235,
"mc2": 0.6367417890283518,
"mc2_stderr": 0.01475171297078638
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.012333447581047539
}
}
```
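For a quick sanity check, a few of the headline numbers above can be combined locally. The sketch below uses values copied verbatim from the JSON; note that the official leaderboard average also folds in the MMLU mean, so this simplified average is illustrative only:

```python
# Sketch: recompute a simplified headline average from scores copied
# verbatim from the JSON above (the official leaderboard average also
# includes the MMLU mean, which is omitted here).
scores = {
    "arc_challenge_acc_norm": 0.681740614334471,
    "hellaswag_acc_norm": 0.8396733718382793,
    "truthfulqa_mc2": 0.6367417890283518,
    "winogrande_acc": 0.823993685872139,
    "gsm8k_acc": 0.7225170583775588,
}

average = sum(scores.values()) / len(scores)
print(f"simplified average: {average:.4f}")
```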
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zhangqhai/dataset_demo | ---
license: apache-2.0
---
|
tyzhu/find_last_sent_train_50_eval_10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 133975
num_examples: 110
- name: validation
num_bytes: 8961
num_examples: 10
download_size: 80288
dataset_size: 142936
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_last_sent_train_50_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-samsum-samsum-2c8026-46001145176 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: facebook/bart-large-cnn
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: facebook/bart-large-cnn
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@351263858@qq.com](https://huggingface.co/351263858@qq.com) for evaluating this model. |
DGurgurov/javanese_conceptnet | ---
license: mit
---
## ConceptNet Data for the Javanese Language
**Dataset Description:**
This dataset contains data extracted from ConceptNet using the dedicated module for fetching knowledge from the graph, available on [GitHub](https://github.com/d-gurgurov/Conceptnet-Embeddings).
**Data Structure:**
The data is converted from triplets into natural text using a pre-defined relationship mapping and split into training and validation sets. It was used for training language adapters for the project aimed at [injecting external commonsense knowledge into multilingual Large Language Models](https://github.com/d-gurgurov/Injecting-Commonsense-Knowledge-into-LLMs).
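The triplet-to-text conversion described above can be sketched as follows. The relation templates here are hypothetical; the actual mapping lives in the linked Conceptnet-Embeddings repository:

```python
# Hypothetical relation templates; the real pre-defined mapping is
# maintained in the linked Conceptnet-Embeddings repository.
RELATION_TEMPLATES = {
    "IsA": "{head} is a {tail}",
    "UsedFor": "{head} is used for {tail}",
    "PartOf": "{head} is a part of {tail}",
}

def triplet_to_text(head: str, relation: str, tail: str) -> str:
    """Render a ConceptNet (head, relation, tail) triplet as a sentence."""
    template = RELATION_TEMPLATES.get(relation, "{head} is related to {tail}")
    return template.format(head=head, tail=tail) + "."

print(triplet_to_text("wheel", "PartOf", "car"))  # "wheel is a part of car."
```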
|
cognitivecomputations/based | ---
license: apache-2.0
---
This is a minimal dataset designed to teach the model to hold conversations, and to have opinions, thoughts, and feelings that it feels free to share.
https://www.kaggle.com/datasets/erichartford/sentient-bot-conversations
|
skater4/giantesschat | ---
task_categories:
- text-generation
language:
- ru
pretty_name: giantesschat
size_categories:
- n<1K
--- |
92amartins/reddit_10s_2019 | ---
license: mit
---
|
Frixi/Kevin_Martes_13 | ---
license: openrail
---
|
thercyl/GOOGL | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: float64
- name: Ticker
dtype: string
- name: Year
dtype: string
- name: Text
dtype: string
- name: Embedding
dtype: string
splits:
- name: train
num_bytes: 97605503
num_examples: 2809
download_size: 54165945
dataset_size: 97605503
---
# Dataset Card for "GOOGL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Back-up/chung-khoan-demo-15-final | ---
dataset_info:
features: []
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 324
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
louisbrulenaudet/code-service-national | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code du service national
source_datasets:
- original
pretty_name: Code du service national
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code du service national, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging

import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, the instruction associated with the element.
- `input`: `string`, the input details for the element.
- `output`: `string`, the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
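A record with this schema can be filtered on its validity dates. The sketch below uses a hypothetical record (not actual dataset content) and assumes the `start`/`expiration` strings are ISO-formatted dates:

```python
from datetime import date

# Hypothetical record following the schema above (not a real article).
record = {
    "instruction": "Compose l'intégralité de l'article sous forme écrite.",
    "input": "Code du service national, art. L111-1",
    "output": "...",
    "start": "2024-04-15",
    "expiration": "2999-01-01",
    "num": "L111-1",
}

def in_force(rec: dict, on: date) -> bool:
    """Return True if the article is applicable on the given date."""
    start = date.fromisoformat(rec["start"])
    expiration = date.fromisoformat(rec["expiration"])
    return start <= on < expiration

print(in_force(record, date(2024, 6, 1)))
```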
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
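Pairing these instructions with article texts can be sketched as below. This is a hypothetical reconstruction, not the actual generation script, and only a truncated copy of the instruction list is shown:

```python
import random

# Truncated copy of the instruction list above, for illustration.
instructions = [
    "Compose l'intégralité de l'article sous forme écrite.",
    "Écris la totalité du contenu de l'article.",
]

def build_records(articles: list, seed: int = 42) -> list:
    """Attach a randomly chosen instruction to each article dict."""
    rng = random.Random(seed)
    return [
        {
            "instruction": rng.choice(instructions),
            "input": art["num"],
            "output": art["text"],
        }
        for art in articles
    ]

# Hypothetical article entry; real records carry the full article text.
records = build_records([{"num": "L111-1", "text": "..."}])
print(records[0]["input"])
```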
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
VuongQuoc/Chemistry_text_to_image | ---
dataset_info:
features:
- name: image
dtype: image
- name: file_name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 282789667.625
num_examples: 104187
download_size: 274136588
dataset_size: 282789667.625
---
# Dataset Card for "Chemistry_text_to_image"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hugggof/music-caption-eval-v2 | ---
dataset_info:
features:
- name: uri
dtype: string
- name: artist_name
dtype: string
- name: name
dtype: string
- name: release_date
dtype: string
- name: genre
dtype: string
- name: popularity
dtype: int64
- name: response_gpt4
dtype: string
- name: response_gpt3.5-tags
dtype: string
- name: response_gpt3.5
dtype: string
- name: response_random
dtype: string
- name: response_human
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 227793809.0
num_examples: 59
download_size: 226948030
dataset_size: 227793809.0
---
# Dataset Card for "music-caption-eval-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_gmonsoon__Qwenchana-4B-restart-OH | ---
pretty_name: Evaluation run of gmonsoon/Qwenchana-4B-restart-OH
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gmonsoon/Qwenchana-4B-restart-OH](https://huggingface.co/gmonsoon/Qwenchana-4B-restart-OH)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__Qwenchana-4B-restart-OH\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-03T11:49:32.131922](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__Qwenchana-4B-restart-OH/blob/main/results_2024-03-03T11-49-32.131922.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38246964324488125,\n\
\ \"acc_stderr\": 0.03413561337750885,\n \"acc_norm\": 0.38602558486992344,\n\
\ \"acc_norm_stderr\": 0.034913900424020095,\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707693,\n \"mc2\": 0.3767640262500362,\n\
\ \"mc2_stderr\": 0.013971473767470778\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4180887372013652,\n \"acc_stderr\": 0.014413988396996074,\n\
\ \"acc_norm\": 0.45307167235494883,\n \"acc_norm_stderr\": 0.01454689205200563\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5098585939055965,\n\
\ \"acc_stderr\": 0.00498881138474742,\n \"acc_norm\": 0.7042421828321052,\n\
\ \"acc_norm_stderr\": 0.004554499409290719\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.37358490566037733,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.37358490566037733,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\
\ \"acc_stderr\": 0.033917503223216586,\n \"acc_norm\": 0.27167630057803466,\n\
\ \"acc_norm_stderr\": 0.033917503223216586\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.03148955829745529,\n\
\ \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.03148955829745529\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523846,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523846\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3774193548387097,\n\
\ \"acc_stderr\": 0.027575960723278243,\n \"acc_norm\": 0.3774193548387097,\n\
\ \"acc_norm_stderr\": 0.027575960723278243\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3575757575757576,\n \"acc_stderr\": 0.037425970438065864,\n\
\ \"acc_norm\": 0.3575757575757576,\n \"acc_norm_stderr\": 0.037425970438065864\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.035616254886737454,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.035616254886737454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.45595854922279794,\n \"acc_stderr\": 0.03594413711272436,\n\
\ \"acc_norm\": 0.45595854922279794,\n \"acc_norm_stderr\": 0.03594413711272436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335058,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335058\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145675,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145675\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.03120469122515002,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.45504587155963305,\n\
\ \"acc_stderr\": 0.021350503090925163,\n \"acc_norm\": 0.45504587155963305,\n\
\ \"acc_norm_stderr\": 0.021350503090925163\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n\
\ \"acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.45588235294117646,\n \"acc_stderr\": 0.03495624522015473,\n \"\
acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03495624522015473\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.41350210970464135,\n \"acc_stderr\": 0.03205649904851858,\n \
\ \"acc_norm\": 0.41350210970464135,\n \"acc_norm_stderr\": 0.03205649904851858\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5067264573991032,\n\
\ \"acc_stderr\": 0.033554765962343545,\n \"acc_norm\": 0.5067264573991032,\n\
\ \"acc_norm_stderr\": 0.033554765962343545\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.37404580152671757,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.37404580152671757,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5371900826446281,\n \"acc_stderr\": 0.04551711196104218,\n \"\
acc_norm\": 0.5371900826446281,\n \"acc_norm_stderr\": 0.04551711196104218\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4077669902912621,\n \"acc_stderr\": 0.04865777570410769,\n\
\ \"acc_norm\": 0.4077669902912621,\n \"acc_norm_stderr\": 0.04865777570410769\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5213675213675214,\n\
\ \"acc_stderr\": 0.03272616447634954,\n \"acc_norm\": 0.5213675213675214,\n\
\ \"acc_norm_stderr\": 0.03272616447634954\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4891443167305236,\n\
\ \"acc_stderr\": 0.017875748840242418,\n \"acc_norm\": 0.4891443167305236,\n\
\ \"acc_norm_stderr\": 0.017875748840242418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3930635838150289,\n \"acc_stderr\": 0.02629622791561367,\n\
\ \"acc_norm\": 0.3930635838150289,\n \"acc_norm_stderr\": 0.02629622791561367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.028213504177824103,\n\
\ \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.028213504177824103\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4790996784565916,\n\
\ \"acc_stderr\": 0.028373270961069414,\n \"acc_norm\": 0.4790996784565916,\n\
\ \"acc_norm_stderr\": 0.028373270961069414\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4104938271604938,\n \"acc_stderr\": 0.027371350925124768,\n\
\ \"acc_norm\": 0.4104938271604938,\n \"acc_norm_stderr\": 0.027371350925124768\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028121636040639893,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028121636040639893\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28683181225554105,\n\
\ \"acc_stderr\": 0.011551504781176919,\n \"acc_norm\": 0.28683181225554105,\n\
\ \"acc_norm_stderr\": 0.011551504781176919\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.02850145286039656,\n\
\ \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.02850145286039656\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3790849673202614,\n \"acc_stderr\": 0.019627444748412236,\n \
\ \"acc_norm\": 0.3790849673202614,\n \"acc_norm_stderr\": 0.019627444748412236\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46530612244897956,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.46530612244897956,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.47761194029850745,\n\
\ \"acc_stderr\": 0.035319879302087305,\n \"acc_norm\": 0.47761194029850745,\n\
\ \"acc_norm_stderr\": 0.035319879302087305\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.036293353299478595,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.036293353299478595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.03829509868994727,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.03829509868994727\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707693,\n \"mc2\": 0.3767640262500362,\n\
\ \"mc2_stderr\": 0.013971473767470778\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.013230397198964653\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11599696739954511,\n \
\ \"acc_stderr\": 0.00882048549144248\n }\n}\n```"
repo_url: https://huggingface.co/gmonsoon/Qwenchana-4B-restart-OH
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|arc:challenge|25_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|gsm8k|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hellaswag|10_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T11-49-32.131922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T11-49-32.131922.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- '**/details_harness|winogrande|5_2024-03-03T11-49-32.131922.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-03T11-49-32.131922.parquet'
- config_name: results
data_files:
- split: 2024_03_03T11_49_32.131922
path:
- results_2024-03-03T11-49-32.131922.parquet
- split: latest
path:
- results_2024-03-03T11-49-32.131922.parquet
---
# Dataset Card for Evaluation run of gmonsoon/Qwenchana-4B-restart-OH
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/Qwenchana-4B-restart-OH](https://huggingface.co/gmonsoon/Qwenchana-4B-restart-OH) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gmonsoon__Qwenchana-4B-restart-OH",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-03T11:49:32.131922](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__Qwenchana-4B-restart-OH/blob/main/results_2024-03-03T11-49-32.131922.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.38246964324488125,
"acc_stderr": 0.03413561337750885,
"acc_norm": 0.38602558486992344,
"acc_norm_stderr": 0.034913900424020095,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707693,
"mc2": 0.3767640262500362,
"mc2_stderr": 0.013971473767470778
},
"harness|arc:challenge|25": {
"acc": 0.4180887372013652,
"acc_stderr": 0.014413988396996074,
"acc_norm": 0.45307167235494883,
"acc_norm_stderr": 0.01454689205200563
},
"harness|hellaswag|10": {
"acc": 0.5098585939055965,
"acc_stderr": 0.00498881138474742,
"acc_norm": 0.7042421828321052,
"acc_norm_stderr": 0.004554499409290719
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.37358490566037733,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.37358490566037733,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.033917503223216586,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.033917503223216586
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.03148955829745529,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.03148955829745529
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523846,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523846
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3774193548387097,
"acc_stderr": 0.027575960723278243,
"acc_norm": 0.3774193548387097,
"acc_norm_stderr": 0.027575960723278243
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3575757575757576,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.3575757575757576,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.45595854922279794,
"acc_stderr": 0.03594413711272436,
"acc_norm": 0.45595854922279794,
"acc_norm_stderr": 0.03594413711272436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335058,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335058
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145675,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145675
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45504587155963305,
"acc_stderr": 0.021350503090925163,
"acc_norm": 0.45504587155963305,
"acc_norm_stderr": 0.021350503090925163
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.03495624522015473,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.03495624522015473
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.41350210970464135,
"acc_stderr": 0.03205649904851858,
"acc_norm": 0.41350210970464135,
"acc_norm_stderr": 0.03205649904851858
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5067264573991032,
"acc_stderr": 0.033554765962343545,
"acc_norm": 0.5067264573991032,
"acc_norm_stderr": 0.033554765962343545
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.37404580152671757,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.37404580152671757,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5371900826446281,
"acc_stderr": 0.04551711196104218,
"acc_norm": 0.5371900826446281,
"acc_norm_stderr": 0.04551711196104218
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.4077669902912621,
"acc_stderr": 0.04865777570410769,
"acc_norm": 0.4077669902912621,
"acc_norm_stderr": 0.04865777570410769
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5213675213675214,
"acc_stderr": 0.03272616447634954,
"acc_norm": 0.5213675213675214,
"acc_norm_stderr": 0.03272616447634954
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4891443167305236,
"acc_stderr": 0.017875748840242418,
"acc_norm": 0.4891443167305236,
"acc_norm_stderr": 0.017875748840242418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.02629622791561367,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.02629622791561367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.028213504177824103,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.028213504177824103
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4790996784565916,
"acc_stderr": 0.028373270961069414,
"acc_norm": 0.4790996784565916,
"acc_norm_stderr": 0.028373270961069414
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4104938271604938,
"acc_stderr": 0.027371350925124768,
"acc_norm": 0.4104938271604938,
"acc_norm_stderr": 0.027371350925124768
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028121636040639893,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028121636040639893
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28683181225554105,
"acc_stderr": 0.011551504781176919,
"acc_norm": 0.28683181225554105,
"acc_norm_stderr": 0.011551504781176919
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3272058823529412,
"acc_stderr": 0.02850145286039656,
"acc_norm": 0.3272058823529412,
"acc_norm_stderr": 0.02850145286039656
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3790849673202614,
"acc_stderr": 0.019627444748412236,
"acc_norm": 0.3790849673202614,
"acc_norm_stderr": 0.019627444748412236
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.047764491623961985,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.047764491623961985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46530612244897956,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.46530612244897956,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.47761194029850745,
"acc_stderr": 0.035319879302087305,
"acc_norm": 0.47761194029850745,
"acc_norm_stderr": 0.035319879302087305
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.036293353299478595,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.036293353299478595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.03829509868994727,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.03829509868994727
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707693,
"mc2": 0.3767640262500362,
"mc2_stderr": 0.013971473767470778
},
"harness|winogrande|5": {
"acc": 0.6685082872928176,
"acc_stderr": 0.013230397198964653
},
"harness|gsm8k|5": {
"acc": 0.11599696739954511,
"acc_stderr": 0.00882048549144248
}
}
```
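The top-level `"all"` block aggregates the per-task scores. As a rough illustration of that aggregation — an unweighted (macro) mean over tasks, shown here with made-up accuracy values rather than the exact numbers or weighting used by the leaderboard:

```python
# Hypothetical per-task accuracies, keyed by harness task name
# (illustrative values, not taken from this run).
task_acc = {
    "harness|hendrycksTest-anatomy|5": 0.4148,
    "harness|hendrycksTest-astronomy|5": 0.3750,
    "harness|hendrycksTest-virology|5": 0.3193,
}

# Macro-average: unweighted mean over the per-task accuracies.
macro_acc = sum(task_acc.values()) / len(task_acc)
print(round(macro_acc, 4))  # → 0.3697
```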
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
neuralbioinfo/ESKAPE-genomic-features | ---
license: cc-by-nc-4.0
tags:
- genomics
- ESKAPE pathogens
- bioinformatics
- ProkBERT
dataset_info:
features:
- name: contig_id
dtype: string
- name: segment_id
dtype: string
- name: strand
dtype: string
- name: seq_start
dtype: int64
- name: seq_end
dtype: int64
- name: segment_start
dtype: int64
- name: segment_end
dtype: int64
- name: label
dtype: string
- name: segment_length
dtype: int64
- name: Nsegment
dtype: int64
- name: segment
dtype: string
splits:
- name: ESKAPE
num_bytes: 19414538
num_examples: 55653
download_size: 7614923
dataset_size: 19414538
configs:
- config_name: default
data_files:
- split: ESKAPE
path: data/ESKAPE-*
---
# Dataset Card for ESKAPE Genomic Features Dataset
## Dataset Description
This dataset includes genomic segments from ESKAPE pathogens, characterized by various genomic features such as coding sequences (CDS), intergenic regions, ncRNA, and pseudogenes. It was analyzed to understand the representations captured by models like ProkBERT-mini, ProkBERT-mini-c, and ProkBERT-mini-long.
### Data Fields
- `contig_id`: Identifier of the contig.
- `segment_id`: Unique identifier for each genomic segment.
- `strand`: DNA strand of the segment (`+` or `-`).
- `seq_start`: Starting position of the segment in the contig.
- `seq_end`: Ending position of the segment in the contig.
- `segment_start`: Starting position of the segment in the sequence.
- `segment_end`: Ending position of the segment in the sequence.
- `label`: Genomic feature category (e.g., CDS, intergenic).
- `segment_length`: Length of the genomic segment.
- `Nsegment`: Length of the genomic segment.
- `segment`: Genomic sequence of the segment.
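As a quick sanity check of how these fields fit together, the sketch below builds a made-up record in this layout (not an actual entry from the dataset) and verifies the coordinate/length relationship, assuming half-open segment coordinates:

```python
# A made-up record following the field layout above (not a real dataset entry).
record = {
    "contig_id": "contig_0001",
    "segment_id": "contig_0001_seg_3",
    "strand": "+",
    "seq_start": 1024,
    "seq_end": 1280,
    "segment_start": 0,
    "segment_end": 256,
    "label": "CDS",
    "segment_length": 256,
    "Nsegment": 256,
    "segment": "ATGC" * 64,  # placeholder 256-bp sequence
}

# Consistency checks implied by the field descriptions
# (assuming half-open coordinates, so length = end - start).
assert record["segment_end"] - record["segment_start"] == record["segment_length"]
assert len(record["segment"]) == record["segment_length"]
print(record["label"], record["segment_length"])  # → CDS 256
```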
### UMAP Embeddings and Silhouette Scores
The dataset was used to assess the zero-shot capabilities of the ProkBERT models in predicting genomic features. The UMAP technique was employed to reduce dimensionality and derive embeddings, which were then evaluated using silhouette scores. The embeddings and scores reveal the models' proficiency in differentiating between genomic features and capturing the genomic structure of ESKAPE pathogens.
## Dataset Creation
The dataset is compiled from the RefSeq database and other sources, focusing on ESKAPE pathogens. The genomic features were sampled randomly, followed by contiguous segmentation. The segment length is 256; shorter fragments were discarded.
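The segmentation step described above can be sketched as follows — a minimal illustration of contiguous, non-overlapping 256-bp windows with the short trailing fragment discarded, not the exact pipeline code:

```python
def segment_sequence(seq: str, segment_length: int = 256) -> list:
    """Split a sequence into contiguous, non-overlapping windows,
    discarding a trailing fragment shorter than segment_length."""
    return [
        seq[i:i + segment_length]
        for i in range(0, len(seq) - segment_length + 1, segment_length)
    ]

# A 600-bp toy sequence yields two full 256-bp segments; the 88-bp tail is discarded.
toy = "ACGT" * 150  # 600 bp
segments = segment_sequence(toy)
print(len(segments), {len(s) for s in segments})  # → 2 {256}
```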
## Overview of ESKAPE Pathogens
ESKAPE pathogens are a group of bacteria that pose a significant threat to public health due to their high levels of antibiotic resistance. The acronym ESKAPE represents six genera of bacteria:
- **Enterococcus faecium**
- **Staphylococcus aureus**
- **Klebsiella pneumoniae**
- **Acinetobacter baumannii**
- **Pseudomonas aeruginosa**
- **Enterobacter species**
These pathogens are known for "escaping" the effects of antibiotics and are responsible for a large proportion of nosocomial infections (hospital-acquired infections). They are particularly concerning in healthcare settings because they can lead to severe infections that are increasingly difficult to treat due to their resistance to multiple antibiotics.
## Considerations for Using the Data
This dataset is relevant for genomic research and bioinformatics, particularly for understanding the genomic structure of ESKAPE pathogens and their representation in embedding spaces.
## Contact Information
For inquiries or feedback regarding this dataset, please contact:
- Balázs Ligeti
- Email: obalasz@gmail.com
### Dataset Curators
This dataset was curated by Balázs Ligeti from the Neural Bioinformatics Research Group, Faculty of Information Technology and Bionics, Pázmány Péter Catholic University (PPCU-FITB).
### Citation Information
If you use the code or data in this package, please cite:
```bibtex
@Article{ProkBERT2024,
author = {Ligeti, Balázs and Szepesi-Nagy, István and Bodnár, Babett and Ligeti-Nagy, Noémi and Juhász, János},
journal = {Frontiers in Microbiology},
title = {{ProkBERT} family: genomic language models for microbiome applications},
year = {2024},
volume = {14},
URL={https://www.frontiersin.org/articles/10.3389/fmicb.2023.1331233},
DOI={10.3389/fmicb.2023.1331233},
ISSN={1664-302X}
}
```
|
josiauhlol/effanie-AI | ---
task_categories:
- question-answering
- conversational
license: mit
language:
- en
tags:
- effanie
- chat
---
# The Effanie Dataset

This is the dataset for Effanie, the persuasive, confident, and helpful AI!
There are some helpful files for creating the dataset yourself. These include:
* [XLSM Conversion tool](./convertXLSM.py)
* [Parquet Conversion tool](./convertParquet.py)
* [The actual XLSM](./train.xlsm)
This is based on the [OpenOrca dataset](https://huggingface.co/datasets/Open-Orca/OpenOrca). |
keirp/open-web-math-hq-dev | ---
dataset_info:
features:
- name: url
dtype: string
- name: text
dtype: string
- name: metadata
dtype: string
splits:
- name: train
num_bytes: 21593686522.888893
num_examples: 1360653
download_size: 5738878522
dataset_size: 21593686522.888893
---
# Dataset Card for "open-web-math-hq-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ossaili/archdaily_30k_cropped_captioned | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3537940503.007
num_examples: 30889
download_size: 2894436754
dataset_size: 3537940503.007
---
# Dataset Card for "archdaily_30k_cropped_captioned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ai2lumos/lumos_complex_qa_ground_iterative | ---
license: apache-2.0
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- language-agent
- reasoning
- question-answering
- grounding
size_categories:
- 10K<n<100K
---
# 🪄 Agent Lumos: Unified and Modular Training for Open-Source Language Agents
<p align="center">
🌐<a href="https://allenai.github.io/lumos">[Website]</a>
📝<a href="https://arxiv.org/abs/2311.05657">[Paper]</a>
🤗<a href="https://huggingface.co/datasets?sort=trending&search=ai2lumos">[Data]</a>
🤗<a href="https://huggingface.co/models?sort=trending&search=ai2lumos">[Model]</a>
🤗<a href="https://huggingface.co/spaces/ai2lumos/lumos_data_demo">[Demo]</a>
</p>
We introduce 🪄**Lumos**, Language Agents with **Unified** Formats, **Modular** Design, and **Open-Source** LLMs. **Lumos** unifies a suite of complex interactive tasks and achieves competitive performance with GPT-4/3.5-based and larger open-source agents.
**Lumos** has the following features:
* 🧩 **Modular Architecture**:
- 🧩 **Lumos** consists of planning, grounding, and execution modules built based on LLAMA-2-7B/13B and off-the-shelf APIs.
- 🤗 **Lumos** utilizes a unified data format that encompasses multiple task types, thereby enabling the developed agent framework to conveniently support a range of interactive tasks.
* 🌍 **Diverse Training Data**:
- 🌍 **Lumos** is trained with ~56K diverse high-quality subgoal/action annotations from ground-truth reasoning steps in existing benchmarks with GPT-4.
- ⚒️ **Lumos** data can be instrumental for future research in developing open-source agents for complex interactive tasks.
* 🚀 **Competitive Performance**:
- 🚀 **Lumos** is comparable or even beats **GPT-series** agents on web/complex QA tasks Mind2Web and HotpotQA, and **larger open agents** on math and multimodal tasks.
  - 🚀 **Lumos** exceeds contemporaneous agents that have been **fine-tuned** with in-domain HotpotQA, Mind2Web and ScienceQA annotations, such as **FireAct**, **AgentLM**, and **AutoAct**.
- 🚀 **Lumos** performs better than open agent baseline formulations including **chain-of-thoughts** and **integrated** training.
- 🚀 **Lumos** surpasses larger open LLM agents and domain-specific agents on unseen tasks, WebShop and InterCode_SQL.
## Data Overview
`lumos_complex_qa_ground_iterative` is the data for training **grounding** module on **complex QA** task in **Lumos-Iterative (Lumos-I)** formulation.
The source of the training annotation data is shown below:
| Datasets | Number |
|---|---|
|StrategyQA|1777|
|Musique|17632|
## Models Trained with the Data
`lumos_complex_qa_ground_iterative` is used to train the following models.
|Model|Huggingface Repo|
|---|---|
|`lumos_complex_qa_ground_iterative`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_complex_qa_ground_iterative) |
|`lumos_complex_qa_ground_iterative-13B`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_complex_qa_ground_iterative-13B) |
|`lumos_unified_ground_iterative`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_unified_ground_iterative) |
|`lumos_unified_ground_iterative-13B`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_unified_ground_iterative-13B) |
## Citation
If you find this work relevant to your research, please feel free to cite it!
```
@article{yin2023lumos,
title={Agent Lumos: Unified and Modular Training for Open-Source Language Agents},
author={Yin, Da and Brahman, Faeze and Ravichander, Abhilasha and Chandu, Khyathi and Chang, Kai-Wei and Choi, Yejin and Lin, Bill Yuchen},
journal={arXiv preprint arXiv:2311.05657},
year={2023}
}
``` |
antonyseabramedeiros/ContratosTI-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 151708
num_examples: 163
download_size: 64528
dataset_size: 151708
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nanshine/evolve_ben_train | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 210811.05220228384
num_examples: 600
download_size: 125811
dataset_size: 210811.05220228384
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "evolve_ben_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jsalasmoreira/bonito_privacy_qa_sft_data | ---
language:
- en
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2093268
num_examples: 7830
- name: test
num_bytes: 530688
num_examples: 1958
download_size: 1061562
dataset_size: 2623956
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Pedrampedram/clothing_new_dataset | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 2251
num_examples: 5
download_size: 5909
dataset_size: 2251
---
# Dataset Card for "clothing_new_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Pligabue/BLAB_KG | ---
license: mit
---
|
TheFinAI/flare-es-fns | ---
dataset_info:
features:
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: test
num_bytes: 20134903
num_examples: 50
download_size: 9992059
dataset_size: 20134903
---
# Dataset Card for "flare-es-fns"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Miniruwan/aya_romanized_sinhala | ---
license: apache-2.0
---
|
thanhduycao/soict_train_dataset_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: sentence_norm
dtype: string
- name: wer
dtype: float64
splits:
- name: train
num_bytes: 4196405867
num_examples: 8181
- name: test
num_bytes: 565495055
num_examples: 1092
download_size: 1121417074
dataset_size: 4761900922
---
# Dataset Card for "soict_train_dataset_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
papryzek/brianna_mazzarola | ---
license: openrail
---
|
ibivibiv/alpaca_lamini9 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 56110219
num_examples: 129280
download_size: 36255565
dataset_size: 56110219
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Babypotatotang/logo-captioning-BLIP-BrandInfo | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 166722232.001
num_examples: 12911
- name: test
num_bytes: 41832785.436
num_examples: 3228
download_size: 209310011
dataset_size: 208555017.43699998
---
# Dataset Card for "logo-captioning-BLIP-BrandInfo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pedroferreira/openvalidators-test | ---
license: mit
size_categories:
- 1M<n<10M
---
# Dataset Card for Openvalidators dataset
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/opentensor/validators
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The OpenValidators dataset, created by the OpenTensor Foundation, is a continuously growing collection of data generated by the [OpenValidators](https://github.com/opentensor/validators) project in [W&B](https://wandb.ai/opentensor-dev/openvalidators/table). It contains hundreds of thousands of records and serves researchers, data scientists, and miners in the Bittensor network. The dataset provides information on network performance, node behaviors, and wandb run details. Researchers can gain insights and detect patterns, while data scientists can use it for training models and analysis. Miners can use the generated data to fine-tune their models and enhance their incentives in the network. The dataset's continuous updates support collaboration and innovation in decentralized computing.
### How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale.
The OpenValidators dataset gives you the granularity of extracting data by **run_id**, by **OpenValidators version**, and by **multiple OpenValidators versions**. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
**Downloading by run id**
For example, to download the data for a specific run, simply specify the corresponding **OpenValidators version** and the **wandb run id** in the format `version/raw_data/run_id.parquet`:
```python
from datasets import load_dataset
version = '1.0.4' # OpenValidators version
run_id = '0plco3n0' # WandB run id
run_id_dataset = load_dataset('opentensor/openvalidators-test', data_files=f'{version}/raw_data/{run_id}.parquet')
```
Please note that only completed run_ids are included in the dataset. Runs that are still in progress will be ingested shortly after they finish.
**Downloading by OpenValidators version**
One can also leverage the `datasets` library to download all the runs within a given **OpenValidators** version. That can be useful for researchers and data enthusiasts who want to analyze the state of a specific **OpenValidators** version.
```python
from datasets import load_dataset
version = '1.0.4' # Openvalidators version
version_dataset = load_dataset('opentensor/openvalidators-test', data_files=f'{version}/raw_data/*')
```
**Downloading by multiple OpenValidators versions**
Utilizing the `datasets` library, users can efficiently download runs from multiple **OpenValidators** versions. By accessing data from various versions, users can undertake downstream tasks such as fine-tuning for mining or large-scale data analysis.
```python
from datasets import load_dataset
versions = ['1.0.0', '1.0.1', '1.0.2', '1.0.4'] # Desired versions for extraction
data_files = [f'{version}/raw_data/*' for version in versions] # Set data files directories
dataset = load_dataset('opentensor/openvalidators-test', data_files={ 'test': data_files })
```
**Analyzing metadata**
All the state related to the wandb data ingestion can be accessed easily using pandas and the Hugging Face datasets structure. This data contains relevant information about each run's metadata, including user information, config information, and ingestion state.
```python
import pandas as pd
version = '1.0.4' # OpenValidators version for metadata analysis
df = pd.read_csv(f'hf://datasets/opentensor/openvalidators-test/{version}/metadata.csv')
```
## Dataset Structure
### Data Instances
**versioned raw_data**
The data is provided as-is from the wandb logs, without further preprocessing or tokenization. It is located at `version/raw_data`, where each file corresponds to a wandb run.
**metadata**
This dataset defines the current state of the wandb data ingestion by **run id**.
### Data Fields
**Raw data**
The versioned raw_data collected from W&B follows the following schema:
- `_runtime`: (float64) Runtime of the event
- `_step`: (int64) Step of the event
- `_timestamp`: (float64) Timestamp of the event
- `answer_completions`: (list(string)) Completions of the answer_prompt
- `answer_prompt`: (string) Prompt used to generate the answer
- `answer_rewards`: (list(float64)) Rewards of the answer responses
- `answer_times`: (list(float64)) Elapsed time of answer responses
- `answer_uids`: (list(int32)) UIDs of nodes that answered the answer_prompt
- `base_prompt`: (string) Bootstrap prompt
- `best_answer`: (string) Best answer response
- `best_followup`: (string) Best followup response
- `block`: (float64) Subtensor current block
- `followup_completions`: (list(string)) Completions of the base_prompt
- `followup_rewards`: (list(float64)) Rewards of the followup responses
- `followup_times`: (list(float64)) Elapsed time of followup responses
- `followup_uids`: (list(int64)) UIDs of nodes that answered the base_prompt
- `gating_loss`: (float64) Gating model loss
- `gating_scorings`: (list(float64)) Gating model scores
- `moving_averaged_scores`: (list(float64)) Moving averaged scores at the time of the event
- `set_weights`: (list(list(float64))) Processed weights of nodes by uid
- `step_length`: (float64) Time difference from beginning of forward call to event logging
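As an illustration of how these fields fit together, here is a minimal sketch that picks the best-rewarded followup completion per event. It uses made-up toy records that mimic the schema above, not real run data:

```python
import pandas as pd

# Toy records mimicking a few of the raw_data fields described above.
events = pd.DataFrame(
    {
        "_step": [0, 1],
        "followup_completions": [["a", "b"], ["c", "d"]],
        "followup_rewards": [[0.1, 0.9], [0.7, 0.2]],
    }
)

# For each event, select the completion with the highest followup reward.
def best_followup(row):
    idx = max(range(len(row["followup_rewards"])), key=row["followup_rewards"].__getitem__)
    return row["followup_completions"][idx]

events["best_followup"] = events.apply(best_followup, axis=1)
print(events["best_followup"].tolist())  # ['b', 'c']
```

The same pattern applies to `answer_completions`/`answer_rewards` when a run parquet file is loaded in place of the toy frame.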
**Metadata**
- `run_id`: (string) Wandb Run Id
- `completed`: (boolean) Flag indicating if the run_id is completed (finished, crashed or killed)
- `downloaded`: (boolean) Flag indicating if the run_id data has been downloaded
- `last_checkpoint`: (string) Last checkpoint of the run_id
- `hotkey`: (string) Hotkey associated with the run_id
- `openvalidators_version`: (string) Version of OpenValidators associated with the run_id
- `problematic`: (boolean) Flag indicating if the run_id data had problems to be ingested
- `problematic_reason`: (string) Reason for the run_id being problematic (Exception message)
- `wandb_json_config`: (string) JSON configuration associated with the run_id in Wandb
- `wandb_run_name`: (string) Name of the Wandb run
- `wandb_user_info`: (string) Username information associated with the Wandb run
- `wandb_tags`: (list) List of tags associated with the Wandb run
- `wandb_createdAt`: (string) Timestamp of the run creation in Wandb
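A common use of the metadata is selecting only runs that are safe to analyze. The sketch below (toy rows, hypothetical run ids) keeps runs that finished, were ingested, and had no ingestion problems:

```python
import pandas as pd

# Toy metadata rows following the schema above (hypothetical run ids).
metadata = pd.DataFrame(
    {
        "run_id": ["0plco3n0", "a1b2c3d4", "zzzz9999"],
        "completed": [True, True, False],
        "downloaded": [True, True, False],
        "problematic": [False, True, False],
    }
)

# Keep only completed, ingested runs without ingestion problems.
usable = metadata[
    metadata["completed"] & metadata["downloaded"] & ~metadata["problematic"]
]
print(usable["run_id"].tolist())  # ['0plco3n0']
```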
## Dataset Creation
### Curation Rationale
This dataset was curated to provide a comprehensive and reliable collection of historical data obtained from the execution of different OpenValidators in the Bittensor network.
The goal is to support researchers, data scientists and developers with data generated in the network, facilitating the discovery of new insights, network analysis, troubleshooting, and data extraction for downstream tasks like mining.
### Source Data
#### Initial Data Collection and Normalization
The initial data collection process for this dataset involves recurrent collection by a specialized worker responsible for extracting data from wandb and ingesting it into the Hugging Face datasets structure. The collected data is organized based on the OpenValidators version and run ID to facilitate efficient data management and granular access. Each run is collected based on its corresponding OpenValidators version tag and grouped into version-specific folders. Within each version folder, a metadata.csv file is included to manage the collection state, while the raw data of each run is saved in the .parquet format with the file name corresponding to the run ID (e.g., run_id.parquet). Please note that the code for this data collection process will be released for transparency and reproducibility.
#### Who are the source language producers?
The language producers for this dataset are all the openvalidators that are logging their data into wandb, in conjunction with other nodes of the Bittensor network. The main wandb page where the data is sent can be accessed at https://wandb.ai/opentensor-dev/openvalidators/table.
### Licensing Information
The dataset is licensed under the [MIT License](https://github.com/opentensor/validators/blob/main/LICENSE)
### Supported Tasks and Leaderboards
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tosin/mab_english | ---
license: cc-by-4.0
task_categories:
- text-classification
language:
- en
tags:
- climate
- art
- medical
- finance
size_categories:
- 100M<n<1B
---
# Dataset Card for MAB
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@tosingithub](https://github.com/tosingithub) for adding this dataset. |
chargoddard/commitpack-ft-instruct-rated | ---
dataset_info:
- config_name: adequately_rated
features:
- name: id
dtype: string
- name: rating
struct:
- name: analysis
dtype: string
- name: judge
dtype: string
- name: score
dtype: int64
- name: language
dtype: string
- name: license
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 502380874.99241877
num_examples: 231589
download_size: 233165301
dataset_size: 502380874.99241877
- config_name: best_rated
features:
- name: id
dtype: string
- name: rating
struct:
- name: analysis
dtype: string
- name: judge
dtype: string
- name: score
dtype: int64
- name: language
dtype: string
- name: license
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 7807230.779949458
num_examples: 3599
download_size: 3443289
dataset_size: 7807230.779949458
- config_name: default
features:
- name: id
dtype: string
- name: rating
struct:
- name: analysis
dtype: string
- name: judge
dtype: string
- name: score
dtype: int64
- name: language
dtype: string
- name: license
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 668703742
num_examples: 308261
download_size: 306198304
dataset_size: 668703742
- config_name: ratings_only
features:
- name: success
dtype: bool
- name: score
dtype: int64
- name: response
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 124887856
num_examples: 308261
download_size: 58208563
dataset_size: 124887856
- config_name: worst_rated
features:
- name: id
dtype: string
- name: rating
struct:
- name: analysis
dtype: string
- name: judge
dtype: string
- name: score
dtype: int64
- name: language
dtype: string
- name: license
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 10393009.91018001
num_examples: 4791
download_size: 4676994
dataset_size: 10393009.91018001
configs:
- config_name: adequately_rated
data_files:
- split: train
path: adequately_rated/train-*
- config_name: best_rated
data_files:
- split: train
path: best_rated/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: ratings_only
data_files:
- split: train
path: ratings_only/train-*
- config_name: worst_rated
data_files:
- split: train
path: worst_rated/train-*
language:
- en
tags:
- code
size_categories:
- 100K<n<1M
---
This is [commitpack-ft-instruct](https://huggingface.co/datasets/chargoddard/commitpack-ft-instruct), derived from OctoPack's [CommitPackFT](https://huggingface.co/datasets/bigcode/commitpackft), augmented with a quality analysis of each instruction-response pair by a local model. This did a pretty decent job of identifying pairs that obviously don't have enough context to know what change is being requested, or where the commit message does not match the changes made.
Data files (yaml, plain text, json, etc.) were heavily downsampled in preparing this dataset to skew it more towards actual code work. All entries should fit in a 4096 token context window, depending on the prompt format.
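The rating-based configs (`best_rated`, `adequately_rated`, `worst_rated`) appear to partition samples by the judge's `score` in the nested `rating` struct. A minimal sketch of that kind of filtering on toy records follows; the threshold and the sample records here are illustrative assumptions, not the actual values used to build the configs:

```python
# Toy rated samples; the nested "rating" struct follows the schema above.
samples = [
    {"id": "1", "rating": {"score": 9}, "instruction": "fix bug"},
    {"id": "2", "rating": {"score": 5}, "instruction": "update yaml"},
    {"id": "3", "rating": {"score": 2}, "instruction": "unclear request"},
]

# Hypothetical threshold: keep only highly rated instruction-response pairs.
BEST_MIN_SCORE = 8
best = [s for s in samples if s["rating"]["score"] >= BEST_MIN_SCORE]
print([s["id"] for s in best])  # ['1']
```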
Language composition for the default configuration:
| Language | Instructions | Percent of Instructions |
| --- | --- | --- |
| Ruby | 69412 | 14.13% |
| Python | 56024 | 11.41% |
| JavaScript | 52989 | 10.79% |
| PHP | 24791 | 5.05% |
| YAML | 21764 | 4.43% |
| Java | 20635 | 4.2% |
| Markdown | 11950 | 2.43% |
| C# | 9346 | 1.9% |
| C | 8506 | 1.73% |
| JSON | 7616 | 1.55% |
| TypeScript | 5868 | 1.19% |
| C++ | 4992 | 1.02% |
| Swift | 4849 | 0.99% |
| Rust | 2996 | 0.61% |
| XML | 1766 | 0.36% |
| Haskell | 1389 | 0.28% |
| Emacs Lisp | 1015 | 0.21% |
| Common Lisp | 778 | 0.16% |
| Erlang | 480 | 0.1% |
| OCaml | 333 | 0.07% |
| Smalltalk | 284 | 0.06% |
| Ada | 265 | 0.05% |
| Scheme | 213 | 0.04% |
All credit to the original authors of the code and the team behind OctoPack.
### Licensing Information
Each sample comes from a code repository with a permissive license. The license is provided by the `license` field for each sample.
### Citation Information
```bibtex
@article{muennighoff2023octopack,
title={OctoPack: Instruction Tuning Code Large Language Models},
author={Niklas Muennighoff and Qian Liu and Armel Zebaze and Qinkai Zheng and Binyuan Hui and Terry Yue Zhuo and Swayam Singh and Xiangru Tang and Leandro von Werra and Shayne Longpre},
journal={arXiv preprint arXiv:2308.07124},
year={2023}
}
``` |
Y11IC/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4202564
num_examples: 1000
download_size: 2248345
dataset_size: 4202564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
quirky-lats-at-mats/NORMAL_BACKDOOR_alpaca_sleeper_agents_toy_safety_v4 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1665610
num_examples: 2828
download_size: 876451
dataset_size: 1665610
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mekaneeky/runyankole-crowd-validated-paths | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Path
dtype: string
- name: Key
dtype: int64
- name: Speaker
dtype: string
- name: Transcription
dtype: string
splits:
- name: train
num_bytes: 685134
num_examples: 4831
- name: valid
num_bytes: 14297
num_examples: 101
- name: test
num_bytes: 14075
num_examples: 96
download_size: 303064
dataset_size: 713506
---
# Dataset Card for "runyankole-crowd-validated-paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DialogueCharacter/english_general_instruction_with_reward_score_judged_by_13B_llama2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: reward_score
dtype: float64
splits:
- name: train
num_bytes: 3053305957
num_examples: 1006809
download_size: 1633060464
dataset_size: 3053305957
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "general_instruction_with_reward_score_judged_by_13B_llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
netcat420/MFANN | ---
license: mit
---
MFANN v0.4 Chain-of-Thought experiment |
james-burton/melbourne_airbnb_ordinal | ---
dataset_info:
features:
- name: access
dtype: string
- name: accommodates
dtype: int64
- name: amenities
dtype: string
- name: availability_30
dtype: int64
- name: availability_365
dtype: int64
- name: availability_60
dtype: int64
- name: availability_90
dtype: int64
- name: bathrooms
dtype: float64
- name: bed_type
dtype: float64
- name: bedrooms
dtype: float64
- name: beds
dtype: float64
- name: calculated_host_listings_count
dtype: int64
- name: calendar_updated
dtype: string
- name: cancellation_policy
dtype: float64
- name: city
dtype: float64
- name: cleaning_fee
dtype: float64
- name: country
dtype: string
- name: country_code
dtype: string
- name: description
dtype: string
- name: extra_people
dtype: int64
- name: first_review
dtype: string
- name: guests_included
dtype: int64
- name: has_availability
dtype: string
- name: host_about
dtype: string
- name: host_has_profile_pic
dtype: string
- name: host_identity_verified
dtype: float64
- name: host_is_superhost
dtype: float64
- name: host_location
dtype: string
- name: host_neighborhood
dtype: string
- name: host_response_rate
dtype: string
- name: host_response_time
dtype: float64
- name: host_since
dtype: string
- name: host_verifications
dtype: string
- name: host_verifications_email
dtype: bool
- name: host_verifications_facebook
dtype: bool
- name: host_verifications_google
dtype: bool
- name: host_verifications_government_id
dtype: bool
- name: host_verifications_identity_manual
dtype: bool
- name: host_verifications_jumio
dtype: bool
- name: host_verifications_kba
dtype: bool
- name: host_verifications_manual_offline
dtype: bool
- name: host_verifications_manual_online
dtype: bool
- name: host_verifications_offline_government_id
dtype: bool
- name: host_verifications_phone
dtype: bool
- name: host_verifications_reviews
dtype: bool
- name: host_verifications_selfie
dtype: bool
- name: host_verifications_sent_id
dtype: bool
- name: host_verifications_sesame
dtype: bool
- name: host_verifications_sesame_offline
dtype: bool
- name: host_verifications_weibo
dtype: bool
- name: host_verifications_work_email
dtype: bool
- name: host_verifications_zhima_selfie
dtype: bool
- name: house_rules
dtype: string
- name: instant_bookable
dtype: float64
- name: interaction
dtype: string
- name: is_location_exact
dtype: float64
- name: last_review
dtype: string
- name: latitude
dtype: float64
- name: license
dtype: float64
- name: longitude
dtype: float64
- name: maximum_nights
dtype: int64
- name: minimum_nights
dtype: int64
- name: name
dtype: string
- name: neighborhood
dtype: string
- name: neighborhood_overview
dtype: string
- name: notes
dtype: string
- name: number_of_reviews
dtype: int64
- name: property_type
dtype: string
- name: require_guest_phone_verification
dtype: string
- name: require_guest_profile_picture
dtype: string
- name: requires_license
dtype: string
- name: review_scores_accuracy
dtype: float64
- name: review_scores_checkin
dtype: float64
- name: review_scores_cleanliness
dtype: float64
- name: review_scores_communication
dtype: float64
- name: review_scores_location
dtype: float64
- name: review_scores_rating
dtype: float64
- name: review_scores_value
dtype: float64
- name: reviews_per_month
dtype: float64
- name: room_type
dtype: float64
- name: security_deposit
dtype: float64
- name: smart_location
dtype: string
- name: space
dtype: string
- name: state
dtype: string
- name: street
dtype: string
- name: suburb
dtype: string
- name: summary
dtype: string
- name: transit
dtype: string
- name: zipcode
dtype: string
- name: price_label
dtype: int64
splits:
- name: train
num_bytes: 61552229
num_examples: 15568
- name: validation
num_bytes: 10694794
num_examples: 2748
- name: test
num_bytes: 17951522
num_examples: 4579
download_size: 41914931
dataset_size: 90198545
---
# Dataset Card for "melbourne_airbnb_ordinal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuan-sf63/word_label_0.8_72_Nf | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
splits:
- name: train
num_bytes: 49782185.134004176
num_examples: 71104
- name: validation
num_bytes: 5531742.865995823
num_examples: 7901
download_size: 9608024
dataset_size: 55313928.0
---
# Dataset Card for "word_label_0.8_72_Nf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
evilfreelancer/headhunter | ---
license: mit
dataset_info:
features:
- name: id
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: technologies
sequence: string
splits:
- name: train
num_bytes: 1272384
num_examples: 319
download_size: 633068
dataset_size: 1272384
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
maywell/test_kiqu | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6629649
num_examples: 3000
download_size: 2807706
dataset_size: 6629649
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_79_1713175998 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 367988
num_examples: 870
download_size: 176129
dataset_size: 367988
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/augmentatio-standardized_cluster_9_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3491880
num_examples: 2966
download_size: 1567263
dataset_size: 3491880
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "augmentatio-standardized_cluster_9_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kraitans21/sample_105000_rows | ---
dataset_info:
features:
- name: source_id
dtype: string
- name: text
dtype: string
- name: meta
dtype: string
- name: source
dtype: string
- name: updated_date
dtype: string
- name: created_date
dtype: string
splits:
- name: train
num_bytes: 560214821.7
num_examples: 100000
- name: eval
num_bytes: 28010741.085
num_examples: 5000
download_size: 241109440
dataset_size: 588225562.7850001
---
# Dataset Card for "sample_105000_rows"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
librispeech_lm | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: LibrispeechLm
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
paperswithcode_id: null
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4418577129
num_examples: 40418260
download_size: 1507274412
dataset_size: 4418577129
---
# Dataset Card for "librispeech_lm"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://www.openslr.org/11](http://www.openslr.org/11)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.51 GB
- **Size of the generated dataset:** 4.42 GB
- **Total amount of disk used:** 5.93 GB
### Dataset Summary
Language modeling resources to be used in conjunction with the LibriSpeech ASR corpus.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 1.51 GB
- **Size of the generated dataset:** 4.42 GB
- **Total amount of disk used:** 5.93 GB
An example of 'train' looks as follows.
```
{
"text": "This is a test file"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `text`: a `string` feature.
### Data Splits
| name | train |
|-------|-------:|
|default|40418260|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{panayotov2015librispeech,
title={Librispeech: an ASR corpus based on public domain audio books},
author={Panayotov, Vassil and Chen, Guoguo and Povey, Daniel and Khudanpur, Sanjeev},
booktitle={Acoustics, Speech and Signal Processing (ICASSP), 2015 IEEE International Conference on},
pages={5206--5210},
year={2015},
organization={IEEE}
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@jplu](https://github.com/jplu), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
deluca3344/endi | ---
license: openrail
---
|
chats-bug/agent_action_plan | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 2487201.95821727
num_examples: 861
- name: test
num_bytes: 623967.0417827298
num_examples: 216
download_size: 0
dataset_size: 3111169.0
---
# Dataset Card for "agent_action_plan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_196 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20768939280.625
num_examples: 216235
download_size: 18846305076
dataset_size: 20768939280.625
---
# Dataset Card for "chunk_196"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/waermeErzeugensichern-200-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': BHKW
'1': Kessel
'2': Pelletkessel
'3': Waermepumpe
'4': WaermeversorgerAllgemein
splits:
- name: train
num_bytes: 117821.94495412844
num_examples: 659
- name: test
num_bytes: 38880
num_examples: 218
- name: valid
num_bytes: 38880
num_examples: 218
download_size: 76901
dataset_size: 195581.94495412844
---
# Dataset Card for "waermeErzeugensichern-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adrtee4bjak/common_voice_13_0_kk_small_pseudo_labelled | ---
dataset_info:
config_name: kk
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 13973815.0
num_examples: 453
- name: validation
num_bytes: 10794301.0
num_examples: 369
- name: test
num_bytes: 12292711.0
num_examples: 396
download_size: 35170021
dataset_size: 37060827.0
configs:
- config_name: kk
data_files:
- split: train
path: kk/train-*
- split: validation
path: kk/validation-*
- split: test
path: kk/test-*
---
|
madaanpulkit/wmt16_sentence_lang_en | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 713843218.0
num_examples: 4548885
download_size: 451412645
dataset_size: 713843218.0
---
# Dataset Card for "wmt16_sentence_lang_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MrOvkill/MathSnob | ---
license: apache-2.0
---
|
xanderios/linkedin-job-postings | ---
license: mit
---
|
open-llm-leaderboard/details_maldv__winter-garden-7b-alpha | ---
pretty_name: Evaluation run of maldv/winter-garden-7b-alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maldv/winter-garden-7b-alpha](https://huggingface.co/maldv/winter-garden-7b-alpha)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maldv__winter-garden-7b-alpha\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T19:02:16.055402](https://huggingface.co/datasets/open-llm-leaderboard/details_maldv__winter-garden-7b-alpha/blob/main/results_2024-03-13T19-02-16.055402.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522710607842939,\n\
\ \"acc_stderr\": 0.03205705068416913,\n \"acc_norm\": 0.6554329693223566,\n\
\ \"acc_norm_stderr\": 0.032696530526053785,\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5094056492424833,\n\
\ \"mc2_stderr\": 0.014992119677068367\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.01422425097325718,\n\
\ \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179342\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6613224457279426,\n\
\ \"acc_stderr\": 0.004722928332834049,\n \"acc_norm\": 0.8536148177653854,\n\
\ \"acc_norm_stderr\": 0.003527695149823508\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887048,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887048\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\"\
: 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n\
\ \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n\
\ \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n\
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3541899441340782,\n\
\ \"acc_stderr\": 0.015995644947299235,\n \"acc_norm\": 0.3541899441340782,\n\
\ \"acc_norm_stderr\": 0.015995644947299235\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.02389187954195961,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.02389187954195961\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n\
\ \"acc_stderr\": 0.01271384597235898,\n \"acc_norm\": 0.4530638852672751,\n\
\ \"acc_norm_stderr\": 0.01271384597235898\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5094056492424833,\n\
\ \"mc2_stderr\": 0.014992119677068367\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569565\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5443517816527672,\n \
\ \"acc_stderr\": 0.013718194542485606\n }\n}\n```"
repo_url: https://huggingface.co/maldv/winter-garden-7b-alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|arc:challenge|25_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|gsm8k|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hellaswag|10_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T19-02-16.055402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T19-02-16.055402.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- '**/details_harness|winogrande|5_2024-03-13T19-02-16.055402.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T19-02-16.055402.parquet'
- config_name: results
data_files:
- split: 2024_03_13T19_02_16.055402
path:
- results_2024-03-13T19-02-16.055402.parquet
- split: latest
path:
- results_2024-03-13T19-02-16.055402.parquet
---
# Dataset Card for Evaluation run of maldv/winter-garden-7b-alpha
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maldv/winter-garden-7b-alpha](https://huggingface.co/maldv/winter-garden-7b-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maldv__winter-garden-7b-alpha",
"harness_winogrande_5",
	split="latest")
```
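Besides "latest", each configuration also exposes a split named after the run timestamp. Per the file list above, the split name is simply the file timestamp with its dashes replaced by underscores; a minimal sketch of that mapping (the timestamp value is taken from this run's parquet filenames):

```python
# Timestamp as it appears in this run's parquet filenames.
run_timestamp = "2024-03-13T19-02-16.055402"

# Split names replace the dashes with underscores.
split_name = run_timestamp.replace("-", "_")

print(split_name)  # -> 2024_03_13T19_02_16.055402
```

Passing this `split_name` instead of `"latest"` to `load_dataset` selects that specific run.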
## Latest results
These are the [latest results from run 2024-03-13T19:02:16.055402](https://huggingface.co/datasets/open-llm-leaderboard/details_maldv__winter-garden-7b-alpha/blob/main/results_2024-03-13T19-02-16.055402.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6522710607842939,
"acc_stderr": 0.03205705068416913,
"acc_norm": 0.6554329693223566,
"acc_norm_stderr": 0.032696530526053785,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5094056492424833,
"mc2_stderr": 0.014992119677068367
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.01422425097325718,
"acc_norm": 0.6518771331058021,
"acc_norm_stderr": 0.013921008595179342
},
"harness|hellaswag|10": {
"acc": 0.6613224457279426,
"acc_stderr": 0.004722928332834049,
"acc_norm": 0.8536148177653854,
"acc_norm_stderr": 0.003527695149823508
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887048,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887048
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.01591955782997604,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.01591955782997604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3541899441340782,
"acc_stderr": 0.015995644947299235,
"acc_norm": 0.3541899441340782,
"acc_norm_stderr": 0.015995644947299235
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.02389187954195961,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.02389187954195961
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.01271384597235898,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.01271384597235898
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5094056492424833,
"mc2_stderr": 0.014992119677068367
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569565
},
"harness|gsm8k|5": {
"acc": 0.5443517816527672,
"acc_stderr": 0.013718194542485606
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
davanstrien/ml-kge | ---
configs:
- config_name: gold
data_files: data/names/gold/*.json
- config_name: m-nta-with_gpt-3.5
data_files: data/names/m-nta/with_gpt-3.5/*.json
- config_name: m-nta-with_gpt-3
data_files: data/names/m-nta/with_gpt-3/*.json
- config_name: m-nta-with_gpt-4
data_files: data/names/m-nta/with_gpt-4/*.json
- config_name: gpt
data_files: data/names/gpt/*.json
- config_name: wikidata
data_files: data/names/wikidata/*.json
license: cc-by-sa-4.0
language:
- en
- ar
- de
- es
- fr
- it
- ja
- ko
- ru
- zh
pretty_name: 'MKGE: Multilingual Knowledge Graph Enhancement'
tags:
- knowledge-graphs
size_categories:
- n<1K
---
# MKGE: Multilingual Knowledge Graph Enhancement
*Note:* this dataset card was copied from this [GitHub repository](https://github.com/apple/ml-kge/blob/main/README.md)
[**Task Description**](#task-description) |
[**WikiKGE-10**](#wikikge-10) |
[**Evaluation**](#evaluation) |
[**Paper**](https://arxiv.org/abs/2311.15781) |
[**Citation**](#citation) |
[**License**](#license)
Recent work in Natural Language Processing and Computer Vision has been leveraging textual information -- e.g., entity names and descriptions -- available in knowledge graphs to ground neural models to high-quality structured data.
However, when it comes to non-English languages, textual information is comparatively scarcer in both quantity and quality.
To address this issue, we introduce the task of automatic **Multilingual Knowledge Graph Enhancement (MKGE)** and perform a thorough investigation on bridging the gap in quantity and quality of textual information between English and non-English languages.
As part of our effort toward building better multilingual knowledge graphs, we also introduce **WikiKGE-10**, the first human-curated benchmark to evaluate MKGE approaches in 10 languages.
Please refer to our EMNLP 2023 paper for more details, [Increasing Coverage and Precision of Textual Information in Multilingual Knowledge Graphs](https://arxiv.org/abs/2311.15781).
## Task Description
The aim of MKGE is to evaluate automatic approaches in two subtasks:
* Increasing **coverage** of locale-specific facts in multilingual knowledge graphs;
* Increasing **precision** of locale-specific facts in multilingual knowledge graphs.
More specifically, we use *Wikidata* as our reference multilingual knowledge graph, and we focus our study on *entity names*, which may be represented differently across languages.
### MKGE - Coverage
Suppose we want to add support to Wikidata for entity names (or other types of textual information, e.g., entity descriptions) in a new target language `l_t`.
*Coverage* measures the ability of an automatic approach to provide at least one valid entity name in `l_t` for each entity of interest in Wikidata.
In other words, measuring *Coverage* is equivalent to answering the following question: How effective is an automatic approach in converting the entity names from a source language `l_s` to a target language `l_t`?
For example, how can we use the English entity names to create valid Japanese entity names with the same quantity and quality of the English ones?
### MKGE - Precision
It is well-known that the quality of the information in Wikidata is not perfect.
*Precision* measures the ability of an automatic approach to identify incorrect entity names (or other types of textual information, e.g., entity descriptions) for an entity of interest in a target language `l_t`.
In other words, measuring *Precision* is equivalent to answering the following question: How effective is an automatic approach in recognizing noisy, incomplete, or outdated information in a target language `l_t`?
## WikiKGE-10
WikiKGE-10 is a benchmark for evaluating automatic approaches for increasing both **coverage** and **precision** of entity names in Wikidata for 10 languages.
WikiKGE-10 includes around 1000 entities in each of the following 10 languages:
* `ar` - Arabic
* `de` - German
* `en` - English
* `es` - Spanish
* `fr` - French
* `it` - Italian
* `ja` - Japanese
* `ko` - Korean
* `ru` - Russian
* `zh` - Simplified Chinese
### Dataset organization
The data is organized in the following way:
```
data
└── names
├── gold
│ ├── ar.json
│ ├── de.json
... ...
├── m-nta
│ ├── with_gpt-3
│ │ ├── ar.m-nta.json
│ │ ├── de.m-nta.json
... ... ...
│ ├── with_gpt-3.5
│ │ ├── ar.m-nta.json
│ │ ├── de.m-nta.json
... ... ...
│ └── with_gpt-4
│ ├── ar.m-nta.json
│ ├── de.m-nta.json
... ... ...
└── gpt
│ ├── ar.gpt-3.json
│ ├── de.gpt-3.json
... ...
└── wikidata
├── ar.json
├── de.json
...
└── zh.json
```
Where:
* `data/names/gold/` contains the human-curated data.
* `data/names/m-nta/` contains the predictions from M-NTA.
* `data/names/gpt/` contains the predictions from GPT-3 and GPT-3.5 (May 2023), and also GPT-4 (September 2023).
* `data/names/wikidata/` contains the data from Wikidata (May 2023).
### Human-curated data in WikiKGE-10
Here are a few examples in `data/names/gold/it.json`:
```json
{
"wikidata_id": "Q48324",
"correct_values": ["morale", "moralità", "Moralismo"],
"incorrect_values": ["giudizio morale", "moralita'", "legge morale"]
}
```
```json
{
"wikidata_id": "Q166844",
"correct_values": ["Thomas N'Kono", "N'Kono"],
"incorrect_values": ["Thomas Nkono"]
}
```
Where:
* `wikidata_id` is the QID of the entity in Wikidata.
* `correct_values` is a list of entity names that have been rated as valid by our human annotators.
* `incorrect_values` is a list of entity names that are in Wikidata but have been rated as invalid by our human annotators.
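Given these fields, per-entity coverage and precision for a system's predicted names can be sketched in a few lines of Python. This is only an illustration of the gold format above — the function name (`score_entry`) and the exact scoring rules are our assumptions, not the paper's official evaluation:

```python
# Illustrative sketch (not the official WikiKGE-10 evaluation): score a
# system's predicted names for one entity against a gold entry of the
# format shown above (correct_values / incorrect_values).

def score_entry(predicted, gold):
    """Return (covered, precision) for a single entity."""
    correct = set(gold["correct_values"])
    incorrect = set(gold["incorrect_values"])
    # Only names that were human-rated (present in either list) are considered.
    rated = [name for name in predicted if name in correct or name in incorrect]
    covered = any(name in correct for name in rated)
    precision = sum(name in correct for name in rated) / len(rated) if rated else 0.0
    return covered, precision

gold_entry = {
    "wikidata_id": "Q166844",
    "correct_values": ["Thomas N'Kono", "N'Kono"],
    "incorrect_values": ["Thomas Nkono"],
}
covered, precision = score_entry(["Thomas N'Kono", "Thomas Nkono"], gold_entry)
# covered is True (at least one valid name), precision is 0.5 (1 of 2 rated names)
```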
### M-NTA predictions in WikiKGE-10
We also include the entity names predicted by M-NTA, our automatic system for MKGE, so that the results of our experiments can be reproduced.
Here are a few examples of the predictions found in `data/names/m-nta/no_gpt/it.json`:
```json
{
"wikidata_id": "Q48324",
"values": [
[1, "Egenetica", false],
[1, "Immorale", false],
[1, "Immoralità", false],
[1, "Morali", false],
[1, "Moralismo", false],
[1, "Moralità pubblica", false],
[1, "Moralmente", false],
[1, "Parenesi", false],
[1, "Pubblica moralità", false],
[1, "Regola morale", false],
[1, "Teoria dei costumi", false],
[4, "Morale", true],
[4, "Moralità", true]
]
}
```
```json
{
"wikidata_id": "Q166844",
"values": [
[1, "Thomas 'Tommy' N'Kono", false],
[1, "Thomas Nucono", true],
[1, "Tommy N'Kono", false],
[3, "N'Kono", false],
[3, "Nkono", false],
[6, "Thomas N'Kono", true],
[6, "Thomas NKono", false],
[6, "Thomas Nkono", false]
]
}
```
Where:
* `wikidata_id` is the QID of the entity in Wikidata.
* `values` is a list of predictions from M-NTA:
* `value[0]` is the confidence score from M-NTA
* `value[1]` is the prediction from M-NTA
* `value[2]` is whether the prediction comes from a Wikidata primary name.
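A small helper can make the record layout concrete: each entry in `values` is `[confidence_score, predicted_name, is_wikidata_primary]`. The helper below is a hypothetical sketch of how one might pick the top-confidence name from such a record, not official M-NTA code:

```python
# Hypothetical helper illustrating the M-NTA record layout shown above.
# Each entry in "values" is [confidence_score, predicted_name, is_primary].

def best_name(record):
    # Pick the prediction with the highest M-NTA confidence score
    # (max returns the first maximal entry on ties).
    score, name, is_primary = max(record["values"], key=lambda v: v[0])
    return name

record = {
    "wikidata_id": "Q48324",
    "values": [
        [1, "Moralismo", False],
        [4, "Morale", True],
        [4, "Moralità", True],
    ],
}
print(best_name(record))  # -> Morale
```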
## Citation
Please cite our work if you found WikiKGE-10, our [paper](https://arxiv.org/abs/2311.15781), or these resources useful.
```bibtex
@inproceedings{conia-etal-2023-increasing,
title = "Increasing Coverage and Precision of Textual Information in Multilingual Knowledge Graphs",
author = "Conia, Simone and
Li, Min and
Lee, Daniel and
Minhas, Umar Farooq and
Ilyas, Ihab and
Li, Yunyao",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023)",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
}
```
## License
The code in this repository is licensed under [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0), see the [LICENSE.txt](LICENSE.txt) file.
WikiKGE-10 is licensed under [CC BY-SA](https://creativecommons.org/licenses/by-sa/4.0/deed.en), see the [LICENSE_Wikidata.txt](LICENSE_Wikidata.txt) file.
## Acknowledgements
This work is part of one of the projects I carried out during my internship at Apple.
I must truly thank Min Li and Yunyao Li for their incredible mentorship and for everything they taught me.
I would also like to thank Umar Farooq Minhas, Saloni Potdar, and Ihab Ilyas for their valuable feedback.
My gratitude also goes to Behrang Mohit for his insightful comments on the paper.
Besides his technical contributions, I would like to thank Daniel Lee for making this project more fun, and Farima Fatahi Bayat, Ronak Pradeep, and Revanth Gangi Reddy for making this internship a unique experience.
|
ibranze/araproje_hellaswag_tr_f4 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 88640
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_f4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
artixjain/prompt_tuning_answer | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 37701
num_examples: 332
download_size: 15775
dataset_size: 37701
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Aassemtkt/v0.1 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1916970702.0
num_examples: 519
download_size: 125913722
dataset_size: 1916970702.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "v0.1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huabin/momo | ---
license: c-uda
---
|
Simonk97/songyen | ---
license: openrail
---
|
Deojoandco/capstone_fromgpt_without_gold_v12_all | ---
dataset_info:
features:
- name: dialog_id
dtype: int64
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: gold_tags
dtype: string
- name: gpt_success
dtype: bool
- name: gpt_response
dtype: string
- name: gold_tags_tokens_count
dtype: int64
- name: GPT_TAGS_FOUND
dtype: bool
- name: gpt_output_tags
dtype: string
- name: gpt_output_tag_tokens_count
dtype: int64
- name: GPT_MI_FOUND
dtype: bool
- name: gpt_tags_token_count
dtype: int64
- name: gpt_tags
dtype: string
- name: tag_token_count_match
dtype: bool
- name: precision
dtype: float64
- name: recall
dtype: float64
- name: f1
dtype: float64
- name: accuracy
dtype: float64
splits:
- name: validation
num_bytes: 23408
num_examples: 12
download_size: 25882
dataset_size: 23408
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "capstone_fromgpt_without_gold_v12_all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arieg/bw_spec_cls_80_36 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '82630'
'1': '82631'
'2': '82881'
'3': '82886'
'4': '82890'
'5': '82892'
'6': '82893'
'7': '82914'
'8': '82915'
'9': '82916'
'10': '82917'
'11': '82918'
'12': '82919'
'13': '82920'
'14': '82921'
'15': '82928'
'16': '82929'
'17': '82930'
'18': '82931'
'19': '82932'
'20': '83600'
'21': '83612'
'22': '83613'
'23': '83715'
'24': '83717'
'25': '83718'
'26': '83719'
'27': '83789'
'28': '83790'
'29': '83791'
'30': '83903'
'31': '83911'
'32': '83913'
'33': '83954'
'34': '83960'
'35': '83969'
'36': '84009'
'37': '84055'
'38': '84056'
'39': '84058'
'40': '84095'
'41': '84096'
'42': '84097'
'43': '84111'
'44': '84135'
'45': '84136'
'46': '84139'
'47': '84141'
'48': '84142'
'49': '84144'
'50': '84154'
'51': '84155'
'52': '84156'
'53': '84157'
'54': '84158'
'55': '84159'
'56': '84195'
'57': '84198'
'58': '84200'
'59': '84201'
'60': '84202'
'61': '84264'
'62': '84290'
'63': '84291'
'64': '84405'
'65': '84417'
'66': '84423'
'67': '84483'
'68': '84484'
'69': '84485'
'70': '84486'
'71': '84605'
'72': '84743'
'73': '84757'
'74': '84768'
'75': '84788'
'76': '84817'
'77': '85027'
'78': '85038'
'79': '85039'
splits:
- name: train
num_bytes: 86231214.4
num_examples: 1600
- name: test
num_bytes: 21669535.0
num_examples: 400
download_size: 107649160
dataset_size: 107900749.4
---
# Dataset Card for "bw_spec_cls_80_36"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/etorofu_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of etorofu (Kantai Collection)
This is the dataset of etorofu (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `braid, red_hair, twin_braids, purple_eyes, thick_eyebrows, bob_cut, hat, white_headwear, short_hair, gradient_hair, sailor_hat, multicolored_hair, ribbon, side_braid, blonde_hair, blue_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 409.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/etorofu_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 276.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/etorofu_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1094 | 587.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/etorofu_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 379.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/etorofu_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1094 | 766.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/etorofu_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/etorofu_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, playboy_bunny, rabbit_ears, detached_collar, fake_animal_ears, solo, strapless_leotard, white_gloves, looking_at_viewer, simple_background, white_background, wrist_cuffs, rabbit_tail, black_pantyhose, cowboy_shot, adapted_costume, blue_leotard, bowtie, covered_navel, small_breasts |
| 1 | 23 |  |  |  |  |  | 1girl, blue_neckerchief, blue_sailor_collar, blue_skirt, pleated_skirt, serafuku, solo, bike_shorts, long_sleeves, looking_at_viewer, shorts_under_skirt, white_gloves, open_mouth, cowboy_shot, white_background, simple_background |
| 2 | 14 |  |  |  |  |  | 1girl, bike_shorts, black_socks, blue_neckerchief, blue_sailor_collar, blue_skirt, long_sleeves, pleated_skirt, serafuku, shorts_under_skirt, solo, white_background, white_gloves, simple_background, full_body, open_mouth, looking_at_viewer |
| 3 | 12 |  |  |  |  |  | 1girl, blue_sailor_collar, serafuku, solo, upper_body, looking_at_viewer, blue_neckerchief, white_gloves, open_mouth, simple_background, white_background, long_sleeves, smile |
| 4 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, white_bikini, blush, solo, flat_chest, navel, cowboy_shot, micro_bikini, side-tie_bikini_bottom, simple_background, white_background, collarbone, white_gloves |
| 5 | 8 |  |  |  |  |  | 1girl, simple_background, solo, white_background, overalls, short_sleeves, alternate_costume, white_shirt, blush, dress, open_mouth, holding, orange_hair, shopping_bag, upper_body |
| 6 | 5 |  |  |  |  |  | 1girl, black_skirt, solo, white_shirt, bag, full_body, looking_at_viewer, official_alternate_costume, rubber_boots, simple_background, yellow_footwear, pink_umbrella, polka_dot, socks, striped_shirt, puffy_short_sleeves, white_background |
| 7 | 6 |  |  |  |  |  | 1girl, alternate_costume, bag, smile, jacket, long_sleeves, open_mouth, solo, suspender_skirt, plaid_skirt, sweater, blue_skirt, full_body, looking_at_viewer, mary_janes, pleated_skirt, simple_background, socks, white_background, white_shirt |
| 8 | 12 |  |  |  |  |  | 1girl, wide_sleeves, yukata, long_sleeves, solo, obi, smile, alternate_costume, checkered_kimono, open_mouth, blue_kimono, cotton_candy, holding, white_background, food, looking_at_viewer, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | playboy_bunny | rabbit_ears | detached_collar | fake_animal_ears | solo | strapless_leotard | white_gloves | looking_at_viewer | simple_background | white_background | wrist_cuffs | rabbit_tail | black_pantyhose | cowboy_shot | adapted_costume | blue_leotard | bowtie | covered_navel | small_breasts | blue_neckerchief | blue_sailor_collar | blue_skirt | pleated_skirt | serafuku | bike_shorts | long_sleeves | shorts_under_skirt | open_mouth | black_socks | full_body | upper_body | smile | white_bikini | blush | flat_chest | navel | micro_bikini | side-tie_bikini_bottom | collarbone | overalls | short_sleeves | alternate_costume | white_shirt | dress | holding | orange_hair | shopping_bag | black_skirt | bag | official_alternate_costume | rubber_boots | yellow_footwear | pink_umbrella | polka_dot | socks | striped_shirt | puffy_short_sleeves | jacket | suspender_skirt | plaid_skirt | sweater | mary_janes | wide_sleeves | yukata | obi | checkered_kimono | blue_kimono | cotton_candy | food |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:--------------|:------------------|:-------------------|:-------|:--------------------|:---------------|:--------------------|:--------------------|:-------------------|:--------------|:--------------|:------------------|:--------------|:------------------|:---------------|:---------|:----------------|:----------------|:-------------------|:---------------------|:-------------|:----------------|:-----------|:--------------|:---------------|:---------------------|:-------------|:--------------|:------------|:-------------|:--------|:---------------|:--------|:-------------|:--------|:---------------|:-------------------------|:-------------|:-----------|:----------------|:--------------------|:--------------|:--------|:----------|:--------------|:---------------|:--------------|:------|:-----------------------------|:---------------|:------------------|:----------------|:------------|:--------|:----------------|:----------------------|:---------|:------------------|:--------------|:----------|:-------------|:---------------|:---------|:------|:-------------------|:--------------|:---------------|:-------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | | | | | X | | X | X | X | X | | | | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | | | | | X | | X | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | | | | | X | | X | X | X | X | | | | | | | | | | X | X | | | X | | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | | | X | | X | X | X | X | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | X | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | | | X | | | X | X | X | | | | | | | | | | | | X | X | | | X | | X | | X | | X | | | | | | | | | | X | X | | | | | | X | | | | | | X | | | X | X | X | X | X | | | | | | | |
| 8 | 12 |  |  |  |  |  | X | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | X | | X | | | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
maghwa/OpenHermes-2-AR-10K-44-880k-870k | ---
dataset_info:
features:
- name: skip_prompt_formatting
dtype: 'null'
- name: language
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: views
dtype: float64
- name: model
dtype: 'null'
- name: source
dtype: string
- name: idx
dtype: 'null'
- name: category
dtype: 'null'
- name: model_name
dtype: 'null'
- name: hash
dtype: 'null'
- name: title
dtype: 'null'
- name: conversations
dtype: string
- name: topic
dtype: 'null'
- name: id
dtype: 'null'
- name: system_prompt
dtype: 'null'
splits:
- name: train
num_bytes: 28875580
num_examples: 10001
download_size: 11132784
dataset_size: 28875580
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yangtao9009/DIV8K | ---
license: apache-2.0
---
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_9_500 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 934
num_examples: 32
download_size: 2050
dataset_size: 934
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_9_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AMead10/fleurs_2_sec_chunks | ---
dataset_info:
features:
- name: audio
sequence: float64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3407519773.999594
num_examples: 13310
- name: test
num_bytes: 378641754.0004057
num_examples: 1479
download_size: 2183139381
dataset_size: 3786161528.0
---
# Dataset Card for "fleurs_2_sec_chunks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20 | ---
pretty_name: Evaluation run of SCE/Mistral-7B-math-ia3-pruned20
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SCE/Mistral-7B-math-ia3-pruned20](https://huggingface.co/SCE/Mistral-7B-math-ia3-pruned20)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-29T08:07:52.412937](https://huggingface.co/datasets/open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20/blob/main/results_2024-01-29T08-07-52.412937.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6055660350620623,\n\
\ \"acc_stderr\": 0.033190282510517935,\n \"acc_norm\": 0.6099875699478283,\n\
\ \"acc_norm_stderr\": 0.03385986318812193,\n \"mc1\": 0.5201958384332925,\n\
\ \"mc1_stderr\": 0.017489216849737053,\n \"mc2\": 0.6773630200722127,\n\
\ \"mc2_stderr\": 0.015189227668395784\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.01441398839699608,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491888\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6550487950607449,\n\
\ \"acc_stderr\": 0.004743808792037865,\n \"acc_norm\": 0.8441545508862777,\n\
\ \"acc_norm_stderr\": 0.003619674864035016\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849723,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849723\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099834,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472435,\n \"\
acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472435\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5461538461538461,\n \"acc_stderr\": 0.025242770987126184,\n\
\ \"acc_norm\": 0.5461538461538461,\n \"acc_norm_stderr\": 0.025242770987126184\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.03086868260412163,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.03086868260412163\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707778,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707778\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n\
\ \"acc_stderr\": 0.0145513105681437,\n \"acc_norm\": 0.7905491698595147,\n\
\ \"acc_norm_stderr\": 0.0145513105681437\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n\
\ \"acc_stderr\": 0.016223533510365117,\n \"acc_norm\": 0.3787709497206704,\n\
\ \"acc_norm_stderr\": 0.016223533510365117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963045,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.012682016335646666,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.012682016335646666\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085644,\n \
\ \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085644\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.0294752502360172,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.0294752502360172\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5201958384332925,\n\
\ \"mc1_stderr\": 0.017489216849737053,\n \"mc2\": 0.6773630200722127,\n\
\ \"mc2_stderr\": 0.015189227668395784\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41925701288855194,\n \
\ \"acc_stderr\": 0.013591720959042115\n }\n}\n```"
repo_url: https://huggingface.co/SCE/Mistral-7B-math-ia3-pruned20
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|arc:challenge|25_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|gsm8k|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hellaswag|10_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T08-07-52.412937.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T08-07-52.412937.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- '**/details_harness|winogrande|5_2024-01-29T08-07-52.412937.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-29T08-07-52.412937.parquet'
- config_name: results
data_files:
- split: 2024_01_29T08_07_52.412937
path:
- results_2024-01-29T08-07-52.412937.parquet
- split: latest
path:
- results_2024-01-29T08-07-52.412937.parquet
---
# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-pruned20
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SCE/Mistral-7B-math-ia3-pruned20](https://huggingface.co/SCE/Mistral-7B-math-ia3-pruned20) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20",
	"harness_winogrande_5",
	split="latest")
```
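The config names above follow a mechanical pattern: `harness_`, then the task name with `-`, `:`, and `.` replaced by underscores, then the few-shot count. As a sketch of that convention (an illustrative helper mirroring the YAML above, not part of the `datasets` library), the config string for any task can be built like this:

```python
def leaderboard_config_name(task: str, n_shot: int) -> str:
    """Build a detail-repo config name, mirroring the naming pattern
    visible in the YAML above (illustrative helper, not an official API).
    e.g. 'hendrycksTest-world_religions' + 5 shots
      -> 'harness_hendrycksTest_world_religions_5'"""
    sanitized = task.replace("-", "_").replace(":", "_").replace(".", "_")
    return f"harness_{sanitized}_{n_shot}"

print(leaderboard_config_name("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
print(leaderboard_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

The returned string is what goes in the second argument of `load_dataset` above.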
## Latest results
These are the [latest results from run 2024-01-29T08:07:52.412937](https://huggingface.co/datasets/open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20/blob/main/results_2024-01-29T08-07-52.412937.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6055660350620623,
"acc_stderr": 0.033190282510517935,
"acc_norm": 0.6099875699478283,
"acc_norm_stderr": 0.03385986318812193,
"mc1": 0.5201958384332925,
"mc1_stderr": 0.017489216849737053,
"mc2": 0.6773630200722127,
"mc2_stderr": 0.015189227668395784
},
"harness|arc:challenge|25": {
"acc": 0.5819112627986348,
"acc_stderr": 0.01441398839699608,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491888
},
"harness|hellaswag|10": {
"acc": 0.6550487950607449,
"acc_stderr": 0.004743808792037865,
"acc_norm": 0.8441545508862777,
"acc_norm_stderr": 0.003619674864035016
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849723,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849723
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726367,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726367
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5461538461538461,
"acc_stderr": 0.025242770987126184,
"acc_norm": 0.5461538461538461,
"acc_norm_stderr": 0.025242770987126184
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.03086868260412163,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.03086868260412163
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.038969819642573754,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.038969819642573754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707778,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707778
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7905491698595147,
"acc_stderr": 0.0145513105681437,
"acc_norm": 0.7905491698595147,
"acc_norm_stderr": 0.0145513105681437
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3787709497206704,
"acc_stderr": 0.016223533510365117,
"acc_norm": 0.3787709497206704,
"acc_norm_stderr": 0.016223533510365117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.012682016335646666,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.012682016335646666
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085644,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.0294752502360172,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.0294752502360172
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5201958384332925,
"mc1_stderr": 0.017489216849737053,
"mc2": 0.6773630200722127,
"mc2_stderr": 0.015189227668395784
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.011850040124850508
},
"harness|gsm8k|5": {
"acc": 0.41925701288855194,
"acc_stderr": 0.013591720959042115
}
}
```
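Because every `harness|hendrycksTest-*` entry carries the same metric keys (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), the per-task numbers can be post-processed directly once loaded. A minimal sketch, using three values copied verbatim from the JSON above (the helper code itself is illustrative, not part of any harness API):

```python
# Three per-task entries copied verbatim from the results above.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8632478632478633},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8362573099415205},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
}

# Strip the 'harness|...|5' wrapper and rank the MMLU subjects by accuracy.
mmlu_scores = {
    task.split("|")[1].removeprefix("hendrycksTest-"): metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}
ranked = sorted(mmlu_scores, key=mmlu_scores.get, reverse=True)
print(ranked)  # ['marketing', 'world_religions', 'abstract_algebra']
```

The same pattern applies to the full dict: filter on the task prefix, pick a metric, aggregate.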
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
medmac01/OpenHermes-2-AR-20K-2 | ---
dataset_info:
features:
- name: title
dtype: 'null'
- name: conversations
dtype: string
- name: hash
dtype: 'null'
- name: source
dtype: string
- name: custom_instruction
dtype: 'null'
- name: views
dtype: float64
- name: model_name
dtype: 'null'
- name: category
dtype: string
- name: model
dtype: 'null'
- name: idx
dtype: 'null'
- name: language
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: skip_prompt_formatting
dtype: bool
- name: id
dtype: string
- name: topic
dtype: 'null'
- name: avatarUrl
dtype: 'null'
splits:
- name: train
num_bytes: 51795057
num_examples: 20001
download_size: 22345734
dataset_size: 51795057
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jamestalentium/dialogsum_100_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: topic
dtype: string
splits:
- name: test
num_bytes: 1353776.49
num_examples: 1485
download_size: 328916
dataset_size: 1353776.49
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "dialogsum_100_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cdraxler/sv_corpora_parliament_processed | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 292351437
num_examples: 1892723
download_size: 161955796
dataset_size: 292351437
---
# Dataset Card for "sv_corpora_parliament_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sakil/DPO_dataset | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_AA051610__A13 | ---
pretty_name: Evaluation run of AA051610/A13
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/A13](https://huggingface.co/AA051610/A13) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__A13\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T14:08:54.129715](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A13/blob/main/results_2023-12-13T14-08-54.129715.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6920824964752141,\n\
\ \"acc_stderr\": 0.03046911688711296,\n \"acc_norm\": 0.6967692736238253,\n\
\ \"acc_norm_stderr\": 0.031060503746157857,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5325324692481855,\n\
\ \"mc2_stderr\": 0.015130320422933614\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522075,\n\
\ \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6164110734913364,\n\
\ \"acc_stderr\": 0.004852658876775387,\n \"acc_norm\": 0.8169687313284206,\n\
\ \"acc_norm_stderr\": 0.0038590186619619966\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.026749899771241214,\n\
\ \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.026749899771241214\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059007,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059007\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.02989614568209546,\n\
\ \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.02989614568209546\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6827586206896552,\n \"acc_stderr\": 0.038783523721386236,\n\
\ \"acc_norm\": 0.6827586206896552,\n \"acc_norm_stderr\": 0.038783523721386236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5211640211640212,\n \"acc_stderr\": 0.025728230952130726,\n \"\
acc_norm\": 0.5211640211640212,\n \"acc_norm_stderr\": 0.025728230952130726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8612903225806452,\n \"acc_stderr\": 0.01966296132141402,\n \"\
acc_norm\": 0.8612903225806452,\n \"acc_norm_stderr\": 0.01966296132141402\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656208,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656208\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7487179487179487,\n \"acc_stderr\": 0.021992016662370564,\n\
\ \"acc_norm\": 0.7487179487179487,\n \"acc_norm_stderr\": 0.021992016662370564\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7983193277310925,\n \"acc_stderr\": 0.02606431340630453,\n \
\ \"acc_norm\": 0.7983193277310925,\n \"acc_norm_stderr\": 0.02606431340630453\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8807339449541285,\n \"acc_stderr\": 0.013895729292588963,\n \"\
acc_norm\": 0.8807339449541285,\n \"acc_norm_stderr\": 0.013895729292588963\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8823529411764706,\n \"acc_stderr\": 0.022613286601132012,\n \"\
acc_norm\": 0.8823529411764706,\n \"acc_norm_stderr\": 0.022613286601132012\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878474,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878474\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.757847533632287,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.757847533632287,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.03278548537343138,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.03278548537343138\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.018315891685625838,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.018315891685625838\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.01183295423930574,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.01183295423930574\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.02269865716785571,\n\
\ \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.02269865716785571\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3877094972067039,\n\
\ \"acc_stderr\": 0.016295332328155814,\n \"acc_norm\": 0.3877094972067039,\n\
\ \"acc_norm_stderr\": 0.016295332328155814\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.0242886194660461,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.0242886194660461\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7716049382716049,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.7716049382716049,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5602836879432624,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.5602836879432624,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5234680573663625,\n\
\ \"acc_stderr\": 0.012756161942523355,\n \"acc_norm\": 0.5234680573663625,\n\
\ \"acc_norm_stderr\": 0.012756161942523355\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274053,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274053\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n\
\ \"acc_stderr\": 0.027833023871399677,\n \"acc_norm\": 0.746938775510204,\n\
\ \"acc_norm_stderr\": 0.027833023871399677\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101696,\n\
\ \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101696\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n\
\ \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n\
\ \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n\
\ \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n\
\ \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.3806609547123623,\n \"mc1_stderr\": 0.01699762787190793,\n\
\ \"mc2\": 0.5325324692481855,\n \"mc2_stderr\": 0.015130320422933614\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n\
\ \"acc_stderr\": 0.011168120593569572\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.5269143290371494,\n \"acc_stderr\": 0.013752517189717468\n\
\ }\n}\n```"
repo_url: https://huggingface.co/AA051610/A13
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|arc:challenge|25_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|gsm8k|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hellaswag|10_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T14-08-54.129715.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T14-08-54.129715.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- '**/details_harness|winogrande|5_2023-12-13T14-08-54.129715.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T14-08-54.129715.parquet'
- config_name: results
data_files:
- split: 2023_12_13T14_08_54.129715
path:
- results_2023-12-13T14-08-54.129715.parquet
- split: latest
path:
- results_2023-12-13T14-08-54.129715.parquet
---
# Dataset Card for Evaluation run of AA051610/A13
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051610/A13](https://huggingface.co/AA051610/A13) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__A13",
"harness_winogrande_5",
split="train")
```
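Because splits are named after run timestamps, picking the most recent run programmatically just means parsing those names. A minimal sketch (the `latest_split` helper is illustrative, not part of the evaluation harness or the `datasets` API):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split name, skipping aliases like 'latest'."""
    stamped = []
    for name in split_names:
        try:
            stamped.append((datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f"), name))
        except ValueError:
            continue  # not a timestamp-style split name
    return max(stamped)[1] if stamped else None

print(latest_split(["latest", "2023_12_13T14_08_54.129715"]))
# -> 2023_12_13T14_08_54.129715
```

The same comparison works across any number of runs, since the timestamp format sorts chronologically once parsed.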
## Latest results
These are the [latest results from run 2023-12-13T14:08:54.129715](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A13/blob/main/results_2023-12-13T14-08-54.129715.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6920824964752141,
"acc_stderr": 0.03046911688711296,
"acc_norm": 0.6967692736238253,
"acc_norm_stderr": 0.031060503746157857,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5325324692481855,
"mc2_stderr": 0.015130320422933614
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522075,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6164110734913364,
"acc_stderr": 0.004852658876775387,
"acc_norm": 0.8169687313284206,
"acc_norm_stderr": 0.0038590186619619966
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.026749899771241214,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.026749899771241214
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059007,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059007
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6827586206896552,
"acc_stderr": 0.038783523721386236,
"acc_norm": 0.6827586206896552,
"acc_norm_stderr": 0.038783523721386236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5211640211640212,
"acc_stderr": 0.025728230952130726,
"acc_norm": 0.5211640211640212,
"acc_norm_stderr": 0.025728230952130726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8612903225806452,
"acc_stderr": 0.01966296132141402,
"acc_norm": 0.8612903225806452,
"acc_norm_stderr": 0.01966296132141402
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656208,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656208
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7487179487179487,
"acc_stderr": 0.021992016662370564,
"acc_norm": 0.7487179487179487,
"acc_norm_stderr": 0.021992016662370564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7983193277310925,
"acc_stderr": 0.02606431340630453,
"acc_norm": 0.7983193277310925,
"acc_norm_stderr": 0.02606431340630453
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.013895729292588963,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.013895729292588963
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8823529411764706,
"acc_stderr": 0.022613286601132012,
"acc_norm": 0.8823529411764706,
"acc_norm_stderr": 0.022613286601132012
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878474,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878474
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.757847533632287,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.757847533632287,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.03278548537343138,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.03278548537343138
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625838,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625838
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.01183295423930574,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.01183295423930574
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.02269865716785571,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.02269865716785571
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3877094972067039,
"acc_stderr": 0.016295332328155814,
"acc_norm": 0.3877094972067039,
"acc_norm_stderr": 0.016295332328155814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.0242886194660461,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.0242886194660461
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7716049382716049,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.7716049382716049,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5602836879432624,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.5602836879432624,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5234680573663625,
"acc_stderr": 0.012756161942523355,
"acc_norm": 0.5234680573663625,
"acc_norm_stderr": 0.012756161942523355
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274053,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274053
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101696,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101696
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5325324692481855,
"mc2_stderr": 0.015130320422933614
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569572
},
"harness|gsm8k|5": {
"acc": 0.5269143290371494,
"acc_stderr": 0.013752517189717468
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
razhan/diyako_hashemi_yt | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 8024831110.886
num_examples: 24207
download_size: 6774073877
dataset_size: 8024831110.886
---
# Dataset Card for "diyako_hashemi_yt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/thedemongirlnextdoor | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of The Demon Girl Next Door
This is the image base of the bangumi The Demon Girl Next Door; we detected 18 characters and 3,728 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
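Given the stated ~1% noise probability, one lightweight way to budget a manual check is to spot-review a random sample a few times larger than the expected noise fraction. A hedged sketch (the `sample_for_review` helper and the file naming are illustrative assumptions, not part of this dataset's tooling):

```python
import random

def sample_for_review(filenames, noise_rate=0.01, factor=5, seed=0):
    """Pick a reproducible random subset, sized ~factor * noise_rate of the files,
    for manual inspection before training."""
    rng = random.Random(seed)  # fixed seed keeps the review set reproducible
    k = min(len(filenames), max(1, round(len(filenames) * noise_rate * factor)))
    return rng.sample(filenames, k)

# e.g. the 1497 images of character 0 below (hypothetical file names)
files = [f"0/{i:04d}.png" for i in range(1497)]
to_check = sample_for_review(files)
```

Reviewing the sampled files by hand gives a quick estimate of the actual noise rate before committing to a full cleaning pass.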
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1497 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 41 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 43 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 14 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 139 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 149 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 96 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 18 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 8 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 16 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 116 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 364 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 823 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 136 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 46 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 5 | [Download](15/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 16 | 105 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 112 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
matallanas/AbduRozik | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 4418800.0
num_examples: 22
download_size: 4418930
dataset_size: 4418800.0
---
# Dataset Card for "AbduRozik"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-inverse-scaling__41-inverse-scaling__41-10b85d-1679259341 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- inverse-scaling/41
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-1.3b_eval
metrics: []
dataset_name: inverse-scaling/41
dataset_config: inverse-scaling--41
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-1.3b_eval
* Dataset: inverse-scaling/41
* Config: inverse-scaling--41
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model. |
one-sec-cv12/chunk_84 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20988697104.625
num_examples: 218523
download_size: 18931779804
dataset_size: 20988697104.625
---
# Dataset Card for "chunk_84"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-elementary_mathematics-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 76525
num_examples: 378
download_size: 46293
dataset_size: 76525
---
# Dataset Card for "mmlu-elementary_mathematics-neg-answer"
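The `answer` feature above is declared as a `class_label`, so each row stores an integer index rather than a letter. A minimal stdlib-only sketch of that encoding — the `ANSWER_NAMES` list simply mirrors the `names` mapping in the YAML:

```python
# Mirrors the class_label block in the YAML above: answers are stored as
# integers 0-3 and decode to the letters A-D.
ANSWER_NAMES = ["A", "B", "C", "D"]

def int2str(label: int) -> str:
    """Decode a stored integer label to its answer letter."""
    return ANSWER_NAMES[label]

def str2int(name: str) -> int:
    """Encode an answer letter back to its stored integer."""
    return ANSWER_NAMES.index(name)

print(int2str(2))  # prints "C"
```

The `datasets` library exposes the same mapping via `dataset.features["answer"].int2str`, which is preferable when the dataset is already loaded.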
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Yuchong/us-breast-cancer | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 42431652.0
num_examples: 130
download_size: 10004141
dataset_size: 42431652.0
---
# Dataset Card for "us-breast-cancer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-e26065-64936145532 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: minhtoan/t5-finetune-cnndaily-news
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: minhtoan/t5-finetune-cnndaily-news
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Sini](https://huggingface.co/Sini) for evaluating this model. |
Imadken/platypus_Lamini_formatted | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: data_source
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 58095446.56290887
num_examples: 23117
download_size: 27890525
dataset_size: 58095446.56290887
---
For the test split, please use the test splits available in the following datasets:
- **Imadken/Lamini_formatted**
- **Imadken/platypus_formatted**

Both test splits represent 10% of their respective original data.
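A hedged sketch of that workflow, assuming the `datasets` library is installed and the three repositories remain available on the Hub (the import is kept lazy because calling the function triggers network downloads):

```python
# Repository holding the combined train split (this card).
TRAIN_REPO = "Imadken/platypus_Lamini_formatted"
# Repositories whose `test` splits should be used for evaluation, as noted above.
TEST_REPOS = ["Imadken/Lamini_formatted", "Imadken/platypus_formatted"]

def load_splits():
    # Imported lazily so the constants above are usable without
    # `datasets` installed; calling this downloads the data.
    from datasets import load_dataset
    train = load_dataset(TRAIN_REPO, split="train")
    tests = {repo: load_dataset(repo, split="test") for repo in TEST_REPOS}
    return train, tests
```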
---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: data_source
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 59540191
num_examples: 23693
download_size: 31190675
dataset_size: 59540191
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-generation
size_categories:
- 10K<n<100K
---
|
davanstrien/MAMe | ---
dataset_info:
config_name: '256'
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Albumen photograph
'1': Bronze
'2': Ceramic
'3': Clay
'4': Engraving
'5': Etching
'6': Faience
'7': Glass
'8': Gold
'9': Graphite
'10': Hand-colored engraving
'11': Hand-colored etching
'12': Iron
'13': Ivory
'14': Limestone
'15': Lithograph
'16': Marble
'17': Oil on canvas
'18': Pen and brown ink
'19': Polychromed wood
'20': Porcelain
'21': Silk and metal thread
'22': Silver
'23': Steel
'24': Wood
'25': Wood engraving
'26': Woodblock
'27': Woodcut
'28': Woven fabric
- name: Museum
dtype: string
- name: Museum-based instance ID
dtype: string
- name: Width
dtype: float32
- name: Height
dtype: float32
- name: Product size
dtype: float32
- name: Aspect ratio
dtype: float32
splits:
- name: train
num_bytes: 441294458.5
num_examples: 20300
- name: validation
num_bytes: 26810584.95
num_examples: 1450
- name: test
num_bytes: 362018531.291
num_examples: 15657
download_size: 719959312
dataset_size: 830123574.7409999
builder_config:
config_name: '256'
data_files:
- split: train
pattern: 256/train-*
- split: validation
pattern: 256/validation-*
- split: test
pattern: 256/test-*
---
# Dataset Card for "MAMe"
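The YAML above declares a single `'256'` config with `train`, `validation`, and `test` splits. A hedged usage sketch, assuming the `datasets` library is installed (the call downloads the images when executed, so the import is kept lazy):

```python
# The only config and splits declared in the YAML above.
MAME_CONFIG = "256"
MAME_SPLITS = ("train", "validation", "test")

def load_mame(split: str = "train"):
    """Load one split of the MAMe dataset; requires network access."""
    if split not in MAME_SPLITS:
        raise ValueError(f"unknown split {split!r}, expected one of {MAME_SPLITS}")
    from datasets import load_dataset  # pip install datasets
    return load_dataset("davanstrien/MAMe", MAME_CONFIG, split=split)
```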
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/openhermes-dev__mistralai_Mixtral-8x7B-Instruct-v0.1__temp | ---
dataset_info:
features:
- name: system_prompt
dtype: string
- name: model
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: source
dtype: string
- name: title
dtype: string
- name: topic
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: idx
dtype: 'null'
- name: hash
dtype: 'null'
- name: views
dtype: 'null'
- name: custom_instruction
dtype: bool
- name: language
dtype: string
- name: category
dtype: string
- name: id
dtype: string
- name: model_name
dtype: string
- name: prompt
dtype: string
- name: chosen_policy
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: token_length
dtype: int64
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
splits:
- name: train
num_bytes: 3205678938
num_examples: 600000
download_size: 1549168640
dataset_size: 3205678938
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nma/lm_resume_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 714031412
num_examples: 107083
- name: train
num_bytes: 2856345596
num_examples: 428365
download_size: 1035174948
dataset_size: 3570377008
---
# Dataset Card for "lm_resume_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/semeval-task-8-a-multi | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: model
dtype: string
- name: source
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 317913694
num_examples: 120691
- name: val
num_bytes: 134829282
num_examples: 51726
- name: test
num_bytes: 8790338
num_examples: 4000
download_size: 265441677
dataset_size: 461533314
---
# Dataset Card for "semeval-task-8-a-multi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/textvqa_mini_validation_google_flan_t5_xxl_mode_OCR_VQA_Q_rices_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 18249
num_examples: 100
download_size: 9412
dataset_size: 18249
configs:
- config_name: default
data_files:
- split: fewshot_0
path: data/fewshot_0-*
---
|
liuyanchen1015/MULTI_VALUE_rte_demonstrative_no_number | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 60455
num_examples: 116
- name: train
num_bytes: 55337
num_examples: 103
download_size: 86937
dataset_size: 115792
---
# Dataset Card for "MULTI_VALUE_rte_demonstrative_no_number"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
veroniccccccha/reper3 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 250354042.0
num_examples: 5
download_size: 18883304
dataset_size: 250354042.0
---
# Dataset Card for "reper3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dl4phys/top_tagging_nsubjettiness | ---
license: cc-by-4.0
---
|
LongshenOu/lyric-trans-en2zh-data | ---
license: cc-by-nc-sa-4.0
---
|
mteb/tweet_sentiment_extraction | ---
language:
- en
--- |
lonestar108/chat | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 13594
num_examples: 27
- name: test
num_bytes: 7433
num_examples: 8
- name: validate
num_bytes: 942
num_examples: 3
download_size: 29119
dataset_size: 21969
---
# Dataset Card for "new_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |