| datasetId | card |
|---|---|
noahshinn/cifar100_2_to_100_constant_size_dataset | ---
configs:
- config_name: default
data_files:
- split: cifar100_2
path: data/cifar100_2-*
- split: cifar100_3
path: data/cifar100_3-*
- split: cifar100_4
path: data/cifar100_4-*
- split: cifar100_5
path: data/cifar100_5-*
- split: cifar100_6
path: data/cifar100_6-*
- split: cifar100_7
path: data/cifar100_7-*
- split: cifar100_8
path: data/cifar100_8-*
- split: cifar100_9
path: data/cifar100_9-*
- split: cifar100_10
path: data/cifar100_10-*
- split: cifar100_11
path: data/cifar100_11-*
- split: cifar100_12
path: data/cifar100_12-*
- split: cifar100_13
path: data/cifar100_13-*
- split: cifar100_14
path: data/cifar100_14-*
- split: cifar100_15
path: data/cifar100_15-*
- split: cifar100_16
path: data/cifar100_16-*
- split: cifar100_17
path: data/cifar100_17-*
- split: cifar100_18
path: data/cifar100_18-*
- split: cifar100_19
path: data/cifar100_19-*
- split: cifar100_20
path: data/cifar100_20-*
- split: cifar100_21
path: data/cifar100_21-*
- split: cifar100_22
path: data/cifar100_22-*
- split: cifar100_23
path: data/cifar100_23-*
- split: cifar100_24
path: data/cifar100_24-*
- split: cifar100_25
path: data/cifar100_25-*
- split: cifar100_26
path: data/cifar100_26-*
- split: cifar100_27
path: data/cifar100_27-*
- split: cifar100_28
path: data/cifar100_28-*
- split: cifar100_29
path: data/cifar100_29-*
- split: cifar100_30
path: data/cifar100_30-*
- split: cifar100_31
path: data/cifar100_31-*
- split: cifar100_32
path: data/cifar100_32-*
- split: cifar100_33
path: data/cifar100_33-*
- split: cifar100_34
path: data/cifar100_34-*
- split: cifar100_35
path: data/cifar100_35-*
- split: cifar100_36
path: data/cifar100_36-*
- split: cifar100_37
path: data/cifar100_37-*
- split: cifar100_38
path: data/cifar100_38-*
- split: cifar100_39
path: data/cifar100_39-*
- split: cifar100_40
path: data/cifar100_40-*
- split: cifar100_41
path: data/cifar100_41-*
- split: cifar100_42
path: data/cifar100_42-*
- split: cifar100_43
path: data/cifar100_43-*
- split: cifar100_44
path: data/cifar100_44-*
- split: cifar100_45
path: data/cifar100_45-*
- split: cifar100_46
path: data/cifar100_46-*
- split: cifar100_47
path: data/cifar100_47-*
- split: cifar100_48
path: data/cifar100_48-*
- split: cifar100_49
path: data/cifar100_49-*
- split: cifar100_50
path: data/cifar100_50-*
- split: cifar100_51
path: data/cifar100_51-*
- split: cifar100_52
path: data/cifar100_52-*
- split: cifar100_53
path: data/cifar100_53-*
- split: cifar100_54
path: data/cifar100_54-*
- split: cifar100_55
path: data/cifar100_55-*
- split: cifar100_56
path: data/cifar100_56-*
- split: cifar100_57
path: data/cifar100_57-*
- split: cifar100_58
path: data/cifar100_58-*
- split: cifar100_59
path: data/cifar100_59-*
- split: cifar100_60
path: data/cifar100_60-*
- split: cifar100_61
path: data/cifar100_61-*
- split: cifar100_62
path: data/cifar100_62-*
- split: cifar100_63
path: data/cifar100_63-*
- split: cifar100_64
path: data/cifar100_64-*
- split: cifar100_65
path: data/cifar100_65-*
- split: cifar100_66
path: data/cifar100_66-*
- split: cifar100_67
path: data/cifar100_67-*
- split: cifar100_68
path: data/cifar100_68-*
- split: cifar100_69
path: data/cifar100_69-*
- split: cifar100_70
path: data/cifar100_70-*
- split: cifar100_71
path: data/cifar100_71-*
- split: cifar100_72
path: data/cifar100_72-*
- split: cifar100_73
path: data/cifar100_73-*
- split: cifar100_74
path: data/cifar100_74-*
- split: cifar100_75
path: data/cifar100_75-*
- split: cifar100_76
path: data/cifar100_76-*
- split: cifar100_77
path: data/cifar100_77-*
- split: cifar100_78
path: data/cifar100_78-*
- split: cifar100_79
path: data/cifar100_79-*
- split: cifar100_80
path: data/cifar100_80-*
- split: cifar100_81
path: data/cifar100_81-*
- split: cifar100_82
path: data/cifar100_82-*
- split: cifar100_83
path: data/cifar100_83-*
- split: cifar100_84
path: data/cifar100_84-*
- split: cifar100_85
path: data/cifar100_85-*
- split: cifar100_86
path: data/cifar100_86-*
- split: cifar100_87
path: data/cifar100_87-*
- split: cifar100_88
path: data/cifar100_88-*
- split: cifar100_89
path: data/cifar100_89-*
- split: cifar100_90
path: data/cifar100_90-*
- split: cifar100_91
path: data/cifar100_91-*
- split: cifar100_92
path: data/cifar100_92-*
- split: cifar100_93
path: data/cifar100_93-*
- split: cifar100_94
path: data/cifar100_94-*
- split: cifar100_95
path: data/cifar100_95-*
- split: cifar100_96
path: data/cifar100_96-*
- split: cifar100_97
path: data/cifar100_97-*
- split: cifar100_98
path: data/cifar100_98-*
- split: cifar100_99
path: data/cifar100_99-*
- split: cifar100_100
path: data/cifar100_100-*
dataset_info:
features:
- name: img
dtype: image
- name: fine_label
dtype: int64
- name: coarse_label
dtype: int64
splits:
- name: cifar100_2
num_bytes: 2225239.0
num_examples: 1000
- name: cifar100_3
num_bytes: 2259599.0
num_examples: 999
- name: cifar100_4
num_bytes: 2286175.0
num_examples: 1000
- name: cifar100_5
num_bytes: 2302471.0
num_examples: 1000
- name: cifar100_6
num_bytes: 2283078.0
num_examples: 1000
- name: cifar100_7
num_bytes: 2299875.875
num_examples: 1001
- name: cifar100_8
num_bytes: 2293253.0
num_examples: 1000
- name: cifar100_9
num_bytes: 2308711.0
num_examples: 1000
- name: cifar100_10
num_bytes: 2277674.0
num_examples: 1000
- name: cifar100_11
num_bytes: 2262994.0
num_examples: 999
- name: cifar100_12
num_bytes: 2263991.0
num_examples: 1000
- name: cifar100_13
num_bytes: 2251367.0
num_examples: 1000
- name: cifar100_14
num_bytes: 2266712.0
num_examples: 1000
- name: cifar100_15
num_bytes: 2285722.0
num_examples: 998
- name: cifar100_16
num_bytes: 2295947.0
num_examples: 1000
- name: cifar100_17
num_bytes: 2284467.0
num_examples: 999
- name: cifar100_18
num_bytes: 2294945.0
num_examples: 1000
- name: cifar100_19
num_bytes: 2285368.0
num_examples: 999
- name: cifar100_20
num_bytes: 2261078.0
num_examples: 1000
- name: cifar100_21
num_bytes: 2244234.0
num_examples: 999
- name: cifar100_22
num_bytes: 2261421.0
num_examples: 999
- name: cifar100_23
num_bytes: 2257559.0
num_examples: 1000
- name: cifar100_24
num_bytes: 2247805.0
num_examples: 997
- name: cifar100_25
num_bytes: 2240527.0
num_examples: 1000
- name: cifar100_26
num_bytes: 2229397.0
num_examples: 999
- name: cifar100_27
num_bytes: 2249080.0
num_examples: 1000
- name: cifar100_28
num_bytes: 2245906.0
num_examples: 998
- name: cifar100_29
num_bytes: 2230364.0
num_examples: 998
- name: cifar100_30
num_bytes: 2220362.0
num_examples: 998
- name: cifar100_31
num_bytes: 2226478.0
num_examples: 999
- name: cifar100_32
num_bytes: 2233878.0
num_examples: 999
- name: cifar100_33
num_bytes: 2233027.0
num_examples: 998
- name: cifar100_34
num_bytes: 2228180.0
num_examples: 996
- name: cifar100_35
num_bytes: 2231362.0
num_examples: 995
- name: cifar100_36
num_bytes: 2233144.0
num_examples: 997
- name: cifar100_37
num_bytes: 2243900.0
num_examples: 999
- name: cifar100_38
num_bytes: 2246473.0
num_examples: 999
- name: cifar100_39
num_bytes: 2236395.0
num_examples: 994
- name: cifar100_40
num_bytes: 2251901.0
num_examples: 1000
- name: cifar100_41
num_bytes: 2233550.0
num_examples: 998
- name: cifar100_42
num_bytes: 2223853.0
num_examples: 996
- name: cifar100_43
num_bytes: 2231828.0
num_examples: 1000
- name: cifar100_44
num_bytes: 2240803.0
num_examples: 997
- name: cifar100_45
num_bytes: 2255019.0
num_examples: 999
- name: cifar100_46
num_bytes: 2247785.0
num_examples: 997
- name: cifar100_47
num_bytes: 2245971.0
num_examples: 999
- name: cifar100_48
num_bytes: 2256391.0
num_examples: 995
- name: cifar100_49
num_bytes: 2260884.0
num_examples: 998
- name: cifar100_50
num_bytes: 2248616.0
num_examples: 1000
- name: cifar100_51
num_bytes: 2244766.0
num_examples: 995
- name: cifar100_52
num_bytes: 2251863.0
num_examples: 999
- name: cifar100_53
num_bytes: 2240318.0
num_examples: 995
- name: cifar100_54
num_bytes: 2241712.0
num_examples: 995
- name: cifar100_55
num_bytes: 2265288.0
num_examples: 1000
- name: cifar100_56
num_bytes: 2242038.0
num_examples: 995
- name: cifar100_57
num_bytes: 2239972.0
num_examples: 995
- name: cifar100_58
num_bytes: 2247974.0
num_examples: 999
- name: cifar100_59
num_bytes: 2249820.875
num_examples: 1001
- name: cifar100_60
num_bytes: 2243773.0
num_examples: 991
- name: cifar100_61
num_bytes: 2245764.0
num_examples: 997
- name: cifar100_62
num_bytes: 2235770.0
num_examples: 998
- name: cifar100_63
num_bytes: 2252900.0
num_examples: 995
- name: cifar100_64
num_bytes: 2246481.0
num_examples: 994
- name: cifar100_65
num_bytes: 2250189.0
num_examples: 997
- name: cifar100_66
num_bytes: 2266965.0
num_examples: 998
- name: cifar100_67
num_bytes: 2261065.0
num_examples: 1000
- name: cifar100_68
num_bytes: 2255291.0
num_examples: 995
- name: cifar100_69
num_bytes: 2253012.0
num_examples: 998
- name: cifar100_70
num_bytes: 2255814.0
num_examples: 998
- name: cifar100_71
num_bytes: 2260155.0
num_examples: 1000
- name: cifar100_72
num_bytes: 2247349.0
num_examples: 998
- name: cifar100_73
num_bytes: 2241562.0
num_examples: 993
- name: cifar100_74
num_bytes: 2232133.0
num_examples: 998
- name: cifar100_75
num_bytes: 2245488.0
num_examples: 999
- name: cifar100_76
num_bytes: 2248830.0
num_examples: 999
- name: cifar100_77
num_bytes: 2243711.0
num_examples: 1000
- name: cifar100_78
num_bytes: 2239671.0
num_examples: 998
- name: cifar100_79
num_bytes: 2225687.0
num_examples: 994
- name: cifar100_80
num_bytes: 2243437.0
num_examples: 998
- name: cifar100_81
num_bytes: 2246395.0
num_examples: 998
- name: cifar100_82
num_bytes: 2257960.75
num_examples: 1002
- name: cifar100_83
num_bytes: 2252038.625
num_examples: 1003
- name: cifar100_84
num_bytes: 2244779.875
num_examples: 1001
- name: cifar100_85
num_bytes: 2241990.0
num_examples: 1000
- name: cifar100_86
num_bytes: 2228242.0
num_examples: 995
- name: cifar100_87
num_bytes: 2259900.0
num_examples: 998
- name: cifar100_88
num_bytes: 2250864.0
num_examples: 997
- name: cifar100_89
num_bytes: 2258215.0
num_examples: 999
- name: cifar100_90
num_bytes: 2267190.0
num_examples: 1000
- name: cifar100_91
num_bytes: 2237768.0
num_examples: 1000
- name: cifar100_92
num_bytes: 2236553.0
num_examples: 998
- name: cifar100_93
num_bytes: 2240125.0
num_examples: 998
- name: cifar100_94
num_bytes: 2223666.0
num_examples: 993
- name: cifar100_95
num_bytes: 2231727.0
num_examples: 996
- name: cifar100_96
num_bytes: 2225043.0
num_examples: 997
- name: cifar100_97
num_bytes: 2244993.0
num_examples: 1000
- name: cifar100_98
num_bytes: 2252969.875
num_examples: 1001
- name: cifar100_99
num_bytes: 2251557.875
num_examples: 1001
- name: cifar100_100
num_bytes: 2255756.0
num_examples: 1000
download_size: 234543230
dataset_size: 222851292.75
---
# Dataset Card for "cifar100_2_to_100_constant_size_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lyon-nlp/mteb-fr-reranking-alloprof-s2p | ---
dataset_info:
features:
- name: query
dtype: string
- name: negative
sequence: string
- name: positive
sequence: string
splits:
- name: train
num_bytes: 391344098
num_examples: 9264
- name: test
num_bytes: 96357308
num_examples: 2316
download_size: 227764827
dataset_size: 487701406
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
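This card ships without a README body; from the features block above, each row pairs a `query` string with `positive` and `negative` sequences of passages. A minimal loading sketch (field usage inferred from the schema, not from upstream documentation):

```python
from datasets import load_dataset

# train/test splits as declared in the YAML above.
train = load_dataset("lyon-nlp/mteb-fr-reranking-alloprof-s2p", split="train")

row = train[0]
query = row["query"]          # query string
positives = row["positive"]   # list of relevant passages
negatives = row["negative"]   # list of non-relevant passages
```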
|
gdurkin/calibrated_3ch_orig_train | ---
dataset_info:
features:
- name: label
dtype: image
- name: pixel_values
dtype: image
splits:
- name: train
num_bytes: 78982074.0
num_examples: 200
download_size: 78961779
dataset_size: 78982074.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
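No README body accompanies this card either; the schema above pairs a `pixel_values` image with a `label` image for each of the 200 training rows (reading `label` as a per-pixel target is an assumption). A minimal loading sketch:

```python
from datasets import load_dataset

ds = load_dataset("gdurkin/calibrated_3ch_orig_train", split="train")
sample = ds[0]
img = sample["pixel_values"]  # input image (PIL)
target = sample["label"]      # paired target image (PIL)
print(img.size, target.size)
```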
|
AhmedBou/Methods | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_hon9kon9ize__CantoneseLLM-6B-preview202402 | ---
pretty_name: Evaluation run of hon9kon9ize/CantoneseLLM-6B-preview202402
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hon9kon9ize/CantoneseLLM-6B-preview202402](https://huggingface.co/hon9kon9ize/CantoneseLLM-6B-preview202402)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hon9kon9ize__CantoneseLLM-6B-preview202402\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T22:17:17.351322](https://huggingface.co/datasets/open-llm-leaderboard/details_hon9kon9ize__CantoneseLLM-6B-preview202402/blob/main/results_2024-02-09T22-17-17.351322.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6242838736375242,\n\
\ \"acc_stderr\": 0.03228004222766128,\n \"acc_norm\": 0.6315704040247714,\n\
\ \"acc_norm_stderr\": 0.032937481575230375,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4225788726241693,\n\
\ \"mc2_stderr\": 0.014623978270427003\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5221843003412969,\n \"acc_stderr\": 0.014597001927076133,\n\
\ \"acc_norm\": 0.5563139931740614,\n \"acc_norm_stderr\": 0.014518421825670444\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5626369249153556,\n\
\ \"acc_stderr\": 0.004950472918523313,\n \"acc_norm\": 0.758016331408086,\n\
\ \"acc_norm_stderr\": 0.004274091605308127\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797611,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797611\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374766,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374766\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382175,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215286,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396987,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396987\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7394957983193278,\n \"acc_stderr\": 0.02851025151234192,\n \
\ \"acc_norm\": 0.7394957983193278,\n \"acc_norm_stderr\": 0.02851025151234192\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073382,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073382\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990936,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990936\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296417,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296417\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n\
\ \"acc_stderr\": 0.01650157930686167,\n \"acc_norm\": 0.41899441340782123,\n\
\ \"acc_norm_stderr\": 0.01650157930686167\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4895697522816167,\n\
\ \"acc_stderr\": 0.012767457253930647,\n \"acc_norm\": 0.4895697522816167,\n\
\ \"acc_norm_stderr\": 0.012767457253930647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824862,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824862\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368043,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368043\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4225788726241693,\n\
\ \"mc2_stderr\": 0.014623978270427003\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7411207576953434,\n \"acc_stderr\": 0.012310515810993376\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3070507960576194,\n \
\ \"acc_stderr\": 0.012705685723131703\n }\n}\n```"
repo_url: https://huggingface.co/hon9kon9ize/CantoneseLLM-6B-preview202402
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|arc:challenge|25_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|gsm8k|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hellaswag|10_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-17-17.351322.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T22-17-17.351322.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- '**/details_harness|winogrande|5_2024-02-09T22-17-17.351322.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T22-17-17.351322.parquet'
- config_name: results
data_files:
- split: 2024_02_09T22_17_17.351322
path:
- results_2024-02-09T22-17-17.351322.parquet
- split: latest
path:
- results_2024-02-09T22-17-17.351322.parquet
---
# Dataset Card for Evaluation run of hon9kon9ize/CantoneseLLM-6B-preview202402
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hon9kon9ize/CantoneseLLM-6B-preview202402](https://huggingface.co/hon9kon9ize/CantoneseLLM-6B-preview202402) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hon9kon9ize__CantoneseLLM-6B-preview202402",
"harness_winogrande_5",
split="train")
```
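The aggregated metrics can be loaded the same way from the "results" configuration; a minimal sketch (the config and split names come from the YAML header above):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run;
# the "latest" split always points to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_hon9kon9ize__CantoneseLLM-6B-preview202402",
	"results",
	split="latest")
print(results[0])
```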
## Latest results
These are the [latest results from run 2024-02-09T22:17:17.351322](https://huggingface.co/datasets/open-llm-leaderboard/details_hon9kon9ize__CantoneseLLM-6B-preview202402/blob/main/results_2024-02-09T22-17-17.351322.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6242838736375242,
"acc_stderr": 0.03228004222766128,
"acc_norm": 0.6315704040247714,
"acc_norm_stderr": 0.032937481575230375,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4225788726241693,
"mc2_stderr": 0.014623978270427003
},
"harness|arc:challenge|25": {
"acc": 0.5221843003412969,
"acc_stderr": 0.014597001927076133,
"acc_norm": 0.5563139931740614,
"acc_norm_stderr": 0.014518421825670444
},
"harness|hellaswag|10": {
"acc": 0.5626369249153556,
"acc_stderr": 0.004950472918523313,
"acc_norm": 0.758016331408086,
"acc_norm_stderr": 0.004274091605308127
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374766,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374766
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382175,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215286,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7394957983193278,
"acc_stderr": 0.02851025151234192,
"acc_norm": 0.7394957983193278,
"acc_norm_stderr": 0.02851025151234192
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073382,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073382
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990936,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990936
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296417,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296417
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41899441340782123,
"acc_stderr": 0.01650157930686167,
"acc_norm": 0.41899441340782123,
"acc_norm_stderr": 0.01650157930686167
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4895697522816167,
"acc_stderr": 0.012767457253930647,
"acc_norm": 0.4895697522816167,
"acc_norm_stderr": 0.012767457253930647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824862,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824862
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368043
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4225788726241693,
"mc2_stderr": 0.014623978270427003
},
"harness|winogrande|5": {
"acc": 0.7411207576953434,
"acc_stderr": 0.012310515810993376
},
"harness|gsm8k|5": {
"acc": 0.3070507960576194,
"acc_stderr": 0.012705685723131703
}
}
```
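As a quick, hedged sketch of how these numbers can be inspected programmatically, the snippet below downloads the results file linked above and averages the per-task MMLU ("hendrycksTest") accuracies. The exact top-level layout of the JSON file is an assumption, so the code falls back gracefully if the metrics sit under a "results" key.
```python
import json
from huggingface_hub import hf_hub_download

# Sketch (not part of the original card): fetch the results file linked in
# "Latest results" and average the per-task MMLU accuracies shown above.
path = hf_hub_download(
	repo_id="open-llm-leaderboard/details_hon9kon9ize__CantoneseLLM-6B-preview202402",
	filename="results_2024-02-09T22-17-17.351322.json",
	repo_type="dataset")
with open(path) as f:
	data = json.load(f)

# Assumption: the metrics dict may be at the top level or under "results".
metrics = data.get("results", data)
mmlu = {k: v for k, v in metrics.items() if k.startswith("harness|hendrycksTest-")}
print(f"mean MMLU acc over {len(mmlu)} subtasks: "
	f"{sum(v['acc'] for v in mmlu.values()) / len(mmlu):.4f}")
```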
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gsstein/0-percent-human-dataset-opt-og | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 86003094
num_examples: 15326
- name: test
num_bytes: 3054268
num_examples: 576
- name: validation
num_bytes: 3251537
num_examples: 576
download_size: 57085020
dataset_size: 92308899
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
GHOFRANEE/LLM_DATASET_bbox | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1428578
num_examples: 155
download_size: 584470
dataset_size: 1428578
---
# Dataset Card for "LLM_DATASET_bbox"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-full | ---
pretty_name: Evaluation run of alignment-handbook/zephyr-7b-dpo-full
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alignment-handbook/zephyr-7b-dpo-full](https://huggingface.co/alignment-handbook/zephyr-7b-dpo-full)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-full\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T01:29:43.310904](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-full/blob/main/results_2024-04-08T01-29-43.310904.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5926699986918813,\n\
\ \"acc_stderr\": 0.03321334145058982,\n \"acc_norm\": 0.6004002100600775,\n\
\ \"acc_norm_stderr\": 0.03393325403898488,\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494877,\n \"mc2\": 0.4740788248392144,\n\
\ \"mc2_stderr\": 0.01579474521827581\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.01433223630679015,\n\
\ \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142824\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6531567416849233,\n\
\ \"acc_stderr\": 0.004749926091672248,\n \"acc_norm\": 0.8444532961561442,\n\
\ \"acc_norm_stderr\": 0.0036168436913607653\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n \
\ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n\
\ \"acc_stderr\": 0.026302774983517414,\n \"acc_norm\": 0.6903225806451613,\n\
\ \"acc_norm_stderr\": 0.026302774983517414\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624528,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \
\ \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.031753678460966245,\n\
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.031753678460966245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016012,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016012\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209828,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209828\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.01480538447837116,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.01480538447837116\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424282,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n\
\ \"acc_stderr\": 0.016175692013381957,\n \"acc_norm\": 0.37318435754189944,\n\
\ \"acc_norm_stderr\": 0.016175692013381957\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684965,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684965\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.026730620728004903,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.026730620728004903\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n\
\ \"acc_stderr\": 0.01261820406658839,\n \"acc_norm\": 0.4230769230769231,\n\
\ \"acc_norm_stderr\": 0.01261820406658839\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.01967580813528151,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.01967580813528151\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727682,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727682\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494877,\n \"mc2\": 0.4740788248392144,\n\
\ \"mc2_stderr\": 0.01579474521827581\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18574677786201668,\n \
\ \"acc_stderr\": 0.010712298902729072\n }\n}\n```"
repo_url: https://huggingface.co/alignment-handbook/zephyr-7b-dpo-full
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|arc:challenge|25_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|gsm8k|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hellaswag|10_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T01-29-43.310904.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T01-29-43.310904.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- '**/details_harness|winogrande|5_2024-04-08T01-29-43.310904.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T01-29-43.310904.parquet'
- config_name: results
data_files:
- split: 2024_04_08T01_29_43.310904
path:
- results_2024-04-08T01-29-43.310904.parquet
- split: latest
path:
- results_2024-04-08T01-29-43.310904.parquet
---
# Dataset Card for Evaluation run of alignment-handbook/zephyr-7b-dpo-full
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alignment-handbook/zephyr-7b-dpo-full](https://huggingface.co/alignment-handbook/zephyr-7b-dpo-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-full",
"harness_winogrande_5",
split="train")
```
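Similarly, the aggregated metrics can be loaded from the "results" configuration described above; a minimal sketch following the same pattern (the "latest" split points to the most recent run):
```python
from datasets import load_dataset

# Aggregated metrics of the most recent run, as used by the leaderboard.
results = load_dataset(
    "open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-full",
    "results",
    split="latest",
)
```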
## Latest results
These are the [latest results from run 2024-04-08T01:29:43.310904](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-full/blob/main/results_2024-04-08T01-29-43.310904.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5926699986918813,
"acc_stderr": 0.03321334145058982,
"acc_norm": 0.6004002100600775,
"acc_norm_stderr": 0.03393325403898488,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494877,
"mc2": 0.4740788248392144,
"mc2_stderr": 0.01579474521827581
},
"harness|arc:challenge|25": {
"acc": 0.5972696245733788,
"acc_stderr": 0.01433223630679015,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.014117971901142824
},
"harness|hellaswag|10": {
"acc": 0.6531567416849233,
"acc_stderr": 0.004749926091672248,
"acc_norm": 0.8444532961561442,
"acc_norm_stderr": 0.0036168436913607653
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624528,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016012,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016012
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209828,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209828
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.01480538447837116,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.01480538447837116
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424282,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381957,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381957
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.027363593284684965,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.027363593284684965
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004903,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004903
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.01261820406658839,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.01261820406658839
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.01967580813528151,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.01967580813528151
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727682,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727682
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494877,
"mc2": 0.4740788248392144,
"mc2_stderr": 0.01579474521827581
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|gsm8k|5": {
"acc": 0.18574677786201668,
"acc_stderr": 0.010712298902729072
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
semeru/causal-se | ---
license: apache-2.0
---
|
CyberHarem/taniguchi_harumi_citrus | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Taniguchi Harumi
This is the dataset of Taniguchi Harumi, containing 72 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 72 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 173 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 192 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 72 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 72 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 72 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 173 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 173 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 142 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 192 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 192 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
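For programmatic access, a minimal sketch using `huggingface_hub` (it assumes the archives listed above sit at the root of this dataset repository, as the relative download links suggest):
```python
from huggingface_hub import hf_hub_download

# Download one of the packaged archives to the local cache and get its path.
zip_path = hf_hub_download(
    repo_id="CyberHarem/taniguchi_harumi_citrus",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(zip_path)
```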
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/3b11819b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1330
dataset_size: 182
---
# Dataset Card for "3b11819b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vhanbri/dontopennottina | ---
license: openrail
task_categories:
- question-answering
language:
- en
pretty_name: not_tina
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ibranze/araproje_hellaswag_tr_conf_mgpt_bestscore_reversed | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 87173
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_mgpt_bestscore_reversed"
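As a minimal usage sketch, the `validation` split declared in the header (250 examples with fields such as `ctx` and `endings`) can be loaded with `datasets`:
```python
from datasets import load_dataset

# Single "validation" split of 250 HellaSwag-style examples (see header).
ds = load_dataset(
    "ibranze/araproje_hellaswag_tr_conf_mgpt_bestscore_reversed",
    split="validation",
)
print(ds[0]["ctx"])
```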
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-d42d3c12-7815006 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xtreme
eval_info:
task: entity_extraction
model: jg/xlm-roberta-base-finetuned-panx-de
metrics: []
dataset_name: xtreme
dataset_config: PAN-X.de
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: jg/xlm-roberta-base-finetuned-panx-de
* Dataset: xtreme
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
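The predictions themselves can be loaded like any other dataset repository; a minimal sketch (the split names are not documented here and may vary):
```python
from datasets import load_dataset

# Inspect the prediction splits stored in this repository.
preds = load_dataset("autoevaluate/autoeval-staging-eval-project-d42d3c12-7815006")
print(preds)
```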
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
open-llm-leaderboard/details_abideen__phi2-pro | ---
pretty_name: Evaluation run of abideen/phi2-pro
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abideen/phi2-pro](https://huggingface.co/abideen/phi2-pro) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abideen__phi2-pro\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T13:29:52.820607](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__phi2-pro/blob/main/results_2024-03-21T13-29-52.820607.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n\
\ \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/abideen/phi2-pro
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|arc:challenge|25_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|gsm8k|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hellaswag|10_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T13-29-52.820607.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T13-29-52.820607.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- '**/details_harness|winogrande|5_2024-03-21T13-29-52.820607.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T13-29-52.820607.parquet'
- config_name: results
data_files:
- split: 2024_03_21T13_29_52.820607
path:
- results_2024-03-21T13-29-52.820607.parquet
- split: latest
path:
- results_2024-03-21T13-29-52.820607.parquet
---
# Dataset Card for Evaluation run of abideen/phi2-pro
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abideen/phi2-pro](https://huggingface.co/abideen/phi2-pro) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abideen__phi2-pro",
"harness_winogrande_5",
split="train")
```
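As a complementary sketch (not part of the auto-generated card), the aggregated metrics can also be loaded through the "results" configuration; the config and split names below are taken from the YAML header above:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_abideen__phi2-pro",
                       "results",
                       split="latest")
print(results[0])  # one row with the aggregated scores
```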
## Latest results
These are the [latest results from run 2024-03-21T13:29:52.820607](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__phi2-pro/blob/main/results_2024-03-21T13-29-52.820607.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
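Since the metrics above are plain nested dictionaries, they are easy to post-process with standard Python. The following is a minimal, hypothetical sketch (`latest_results` is just a small excerpt of the dictionary above, not a variable provided by the harness) that ranks MMLU subtasks by accuracy:
```python
# Minimal sketch: rank MMLU (hendrycksTest) subtasks by accuracy.
# `latest_results` is a hypothetical excerpt of the metrics dict shown above.
latest_results = {
    "harness|hendrycksTest-world_religions|5": {"acc": 0.3216374269005848},
    "harness|hendrycksTest-human_aging|5": {"acc": 0.31390134529147984},
    "harness|hendrycksTest-machine_learning|5": {"acc": 0.3125},
    "harness|hendrycksTest-high_school_chemistry|5": {"acc": 0.15270935960591134},
}

mmlu_acc = {
    # str.removeprefix requires Python 3.9+
    task.removeprefix("harness|hendrycksTest-").split("|")[0]: scores["acc"]
    for task, scores in latest_results.items()
    if task.startswith("harness|hendrycksTest-")
}
for task, acc in sorted(mmlu_acc.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: {acc:.3f}")
```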
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Subramanya3/shawgpt-youtube-comments | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 42749
num_examples: 50
- name: test
num_bytes: 7974
num_examples: 9
download_size: 27594
dataset_size: 50723
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
arpitdvd/Heart_Annotations | ---
license: apache-2.0
---
|
hf-internal-testing/etth1-hourly-batch | ---
license: cc-by-nd-4.0
---
|
HamdanXI/paradetox-preprocess-maskedComments | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: en_toxic_comment
dtype: string
- name: en_neutral_comment
dtype: string
- name: edit_ops
sequence:
sequence: string
- name: masked_comment
dtype: string
splits:
- name: train
num_bytes: 6126021
num_examples: 19744
download_size: 2488196
dataset_size: 6126021
---
# Dataset Card for "paradetox-preprocess-maskedComments"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a4 | ---
pretty_name: Evaluation run of BFauber/lora_llama2-13b_10e5_r32_a4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/lora_llama2-13b_10e5_r32_a4](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r32_a4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T00:18:04.482828](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a4/blob/main/results_2024-02-10T00-18-04.482828.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5547509888306777,\n\
\ \"acc_stderr\": 0.03370345349790658,\n \"acc_norm\": 0.5608687368965364,\n\
\ \"acc_norm_stderr\": 0.03442867116165037,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.38132659209343317,\n\
\ \"mc2_stderr\": 0.013760048011688938\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064663,\n\
\ \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578278\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6166102370045807,\n\
\ \"acc_stderr\": 0.00485218262127426,\n \"acc_norm\": 0.8242381995618403,\n\
\ \"acc_norm_stderr\": 0.003798395055021539\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325583,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325583\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.667741935483871,\n \"acc_stderr\": 0.026795560848122804,\n \"\
acc_norm\": 0.667741935483871,\n \"acc_norm_stderr\": 0.026795560848122804\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.037563357751878974,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.037563357751878974\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n\
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501624,\n \"\
acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842544,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842544\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935575,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935575\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890474,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890474\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494569,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494569\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3128491620111732,\n\
\ \"acc_stderr\": 0.01550689259464727,\n \"acc_norm\": 0.3128491620111732,\n\
\ \"acc_norm_stderr\": 0.01550689259464727\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719964,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719964\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42046936114732725,\n\
\ \"acc_stderr\": 0.012607654553832705,\n \"acc_norm\": 0.42046936114732725,\n\
\ \"acc_norm_stderr\": 0.012607654553832705\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455502,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455502\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886528,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789848,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789848\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.38132659209343317,\n\
\ \"mc2_stderr\": 0.013760048011688938\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.01185004012485051\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2266868840030326,\n \
\ \"acc_stderr\": 0.01153275800933999\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/lora_llama2-13b_10e5_r32_a4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-18-04.482828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-18-04.482828.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- '**/details_harness|winogrande|5_2024-02-10T00-18-04.482828.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T00-18-04.482828.parquet'
- config_name: results
data_files:
- split: 2024_02_10T00_18_04.482828
path:
- results_2024-02-10T00-18-04.482828.parquet
- split: latest
path:
- results_2024-02-10T00-18-04.482828.parquet
---
# Dataset Card for Evaluation run of BFauber/lora_llama2-13b_10e5_r32_a4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/lora_llama2-13b_10e5_r32_a4](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r32_a4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a4",
"harness_winogrande_5",
split="train")
```
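To get the aggregated metrics rather than per-task details, the `results` configuration can be loaded the same way (a minimal sketch; per the configs above, the `latest` split points at the most recent run):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run ("latest" split of the "results" config).
results = load_dataset(
    "open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a4",
    "results",
    split="latest",
)
```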
## Latest results
These are the [latest results from run 2024-02-10T00:18:04.482828](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r32_a4/blob/main/results_2024-02-10T00-18-04.482828.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5547509888306777,
"acc_stderr": 0.03370345349790658,
"acc_norm": 0.5608687368965364,
"acc_norm_stderr": 0.03442867116165037,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.38132659209343317,
"mc2_stderr": 0.013760048011688938
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064663,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578278
},
"harness|hellaswag|10": {
"acc": 0.6166102370045807,
"acc_stderr": 0.00485218262127426,
"acc_norm": 0.8242381995618403,
"acc_norm_stderr": 0.003798395055021539
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325583,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325583
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.026795560848122804,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.026795560848122804
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.037563357751878974,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.037563357751878974
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501624,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842544,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842544
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935575,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935575
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890474,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890474
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494569,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494569
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977254,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3128491620111732,
"acc_stderr": 0.01550689259464727,
"acc_norm": 0.3128491620111732,
"acc_norm_stderr": 0.01550689259464727
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515961,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515961
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719964,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719964
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42046936114732725,
"acc_stderr": 0.012607654553832705,
"acc_norm": 0.42046936114732725,
"acc_norm_stderr": 0.012607654553832705
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455502,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455502
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886528,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789848,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789848
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.38132659209343317,
"mc2_stderr": 0.013760048011688938
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.01185004012485051
},
"harness|gsm8k|5": {
"acc": 0.2266868840030326,
"acc_stderr": 0.01153275800933999
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zerolink/zsql-snowflake-dpo | ---
dataset_info:
features:
- name: schema
dtype: string
- name: question
dtype: string
- name: rejected
dtype: string
- name: chosen
dtype: string
- name: weight
dtype: float64
splits:
- name: train
num_bytes: 250333739.2658651
num_examples: 234216
- name: test
num_bytes: 27815928.734134898
num_examples: 26025
download_size: 87415316
dataset_size: 278149668.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
FanChen0116/bus_few4_40x_pvi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 345681
num_examples: 1400
- name: validation
num_bytes: 6900
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 45026
dataset_size: 423199
---
# Dataset Card for "bus_few4_40x_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_137 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1171045420.0
num_examples: 228185
download_size: 1199589445
dataset_size: 1171045420.0
---
# Dataset Card for "chunk_137"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ytnjnh11/ytnjnh11 | ---
license: openrail
---
|
kanishka/counterfactual-babylm-pipps_and_keys_to_it_all_removal | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 581838721
num_examples: 11634224
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 421689270
dataset_size: 637958951
---
# Dataset Card for "counterfactual-babylm-pipps_and_keys_to_it_all_removal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_same_length_find_passage_train50_eval40_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 45381
num_examples: 140
- name: validation
num_bytes: 16031
num_examples: 40
download_size: 40329
dataset_size: 61412
---
# Dataset Card for "random_letter_same_length_find_passage_train50_eval40_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ShenaoZhang/0.001_idpo_noreplacerej_ref_response | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: reference_response
dtype: string
splits:
- name: train_prefs_1
num_bytes: 164111773
num_examples: 20378
- name: test_prefs_1
num_bytes: 16019213
num_examples: 2000
- name: train_prefs_2
num_bytes: 168516655
num_examples: 20378
- name: test_prefs_2
num_bytes: 16429987
num_examples: 2000
download_size: 201888771
dataset_size: 365077628
configs:
- config_name: default
data_files:
- split: train_prefs_1
path: data/train_prefs_1-*
- split: test_prefs_1
path: data/test_prefs_1-*
- split: train_prefs_2
path: data/train_prefs_2-*
- split: test_prefs_2
path: data/test_prefs_2-*
---
# Dataset Card for "0.001_idpo_noreplacerej_ref_response"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hunzla/omnisonus | ---
dataset_info:
features:
- name: file_name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: intention
dtype: string
- name: accent
dtype: string
splits:
- name: train
num_bytes: 6437781064.534813
num_examples: 36468
- name: test
num_bytes: 804810899.2325933
num_examples: 4559
- name: validation
num_bytes: 804810899.2325933
num_examples: 4559
download_size: 8029293409
dataset_size: 8047402863.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Omni Sonus (All Speech) Dataset for speech-related tasks
Multilingual speech dataset for multiple tasks including:
1. Speech Recognition.
2. Speech Synthesis.
3. Speech Emotion Recognition.
4. Speech Classification.
5. Speaker Classification.
6. Keyword Spotting.
7. Implementing new ideas.
## Dataset Details
Dataset Composition:
Encompasses a vast collection of audio recordings featuring both male and female speakers.
Each speaker contributes to the dataset across a range of emotions, ensuring diversity and comprehensiveness.
Professional speakers were chosen to provide a polished and clear representation of spoken text.
1. Languages and Accents:
Primarily focused on German and English accents in Version 1.0.
Future iterations planned to include a multitude of languages, with a special emphasis on Asian accents (Pakistani, Indian, Chinese) and the inclusion of Urdu language.
Aim to create a truly multilingual dataset to cater to a broader audience and enhance the model's adaptability.
2. Intention and Task Labeling:
The dataset is labeled based on the intention of the speaker, providing valuable insights into customer emotions during various tasks.
Intentions cover a spectrum of scenarios, including but not limited to customer service queries, informational requests, and emotional expressions.
3. Demographic Information:
Includes demographic details such as age and gender for each speaker.
Aims to capture a diverse representation of age groups and gender identities, contributing to a well-rounded and inclusive dataset.
4. Text Variation:
Each text in the dataset is spoken multiple times, ensuring robustness and variability in the training data.
This approach helps the model learn to recognize emotions and intentions across different instances of the same text.
5. Duration Range:
Spans a range of durations for each audio clip, mimicking real-world scenarios where interactions can vary in length.
Ensures that the model is adept at handling both short and extended conversational snippets.
6. Upcoming Enhancements:
Future versions are planned to feature an expanded range of accents, including but not limited to Urdu, and additional Asian accents.
Continuous updates to enrich the dataset and maintain its relevance in the ever-evolving landscape of language and communication.
This dataset serves as a robust resource for training models to understand and respond to human emotions, intentions, and accents, making it a valuable asset for applications ranging from customer service to emotional AI interfaces.
### Dataset Description
While the primary objective of this dataset lies in customer intention recognition, its versatility extends beyond the realm of customer
service applications.
This multilingual speech dataset holds immense potential for a diverse array of tasks, making it a valuable resource for various
applications in the field of natural language processing.
The dataset can be effectively utilized for tasks such as speech recognition, where the model can learn to transcribe spoken words
accurately.
Additionally, it is well-suited for speech synthesis, enabling the generation of natural-sounding and emotionally expressive synthetic
speech.
Speech emotion recognition benefits from the dataset's rich labeling of emotional states, contributing to the development of models that
can discern and respond to human emotions effectively.
Furthermore, the dataset supports speech classification and speaker classification tasks, offering a foundation for training models to
identify distinct speakers or classify spoken content.
It also facilitates keyword spotting, aiding in the identification of specific terms or phrases within spoken language.
Lastly, the dataset provides a robust platform for implementing new ideas, encouraging innovation and exploration within the domain of
multilingual speech processing.
Its adaptability across multiple tasks makes it a valuable asset for researchers and developers seeking a comprehensive and diverse speech
dataset.
### Dataset Sources [optional]
For now, this dataset is available on Hugging Face only, but we aim to introduce the following sources soon:
- **Repository:** coming soon...
- **Paper [optional]:** coming soon...
- **Demo [optional]:** coming soon...
## Uses
Below are simplified code snippets using the datasets library in Python to load and use the described omni-sonus dataset.
For the sake of illustration, we assume that the dataset is available in the Hugging Face datasets hub.
```python
from datasets import load_dataset

dataset = load_dataset("Hunzla/omnisonus")
```
You can use all the methods provided by the datasets library. Please refer to the following documentation:
https://huggingface.co/docs/datasets/index
And don't forget to update the datasets library in case of errors.
## Dataset Structure
The dataset primarily consists of the following columns:
1. file_name => A unique identifier for each audio, 14 characters long, with each position carrying a specific meaning (a parsing sketch follows this list):
(i). The first two digits represent the age of the speaker.
(ii). The third character represents the gender of the speaker: m for male and f for female.
(iii). The next three characters, from index 4 to 6, represent an emotion with the following codes:
"ang" => angry,
"bor" => bored,
"dis" => disgusting,
"anx" => anxiety/fear,
"hap" => happy,
"sad" => sadness,
"neu" => neutral/normal
(iv). The next 2 characters, at index 7 and 8, together represent the spoken language.
You can find the language codes at https://en.wikipedia.org/wiki/List_of_ISO_639_language_codes
(v). Finally, the last 6 characters, from index 9 to 14, represent the duration and the unit of time measurement, usually ms (milliseconds).
Example: "35fboren1960ms" <= This file_name represents a 35-year-old female speaker who is bored and speaking English.
Additionally, the duration of the example audio is 1960 milliseconds.
2. audio => Represents an audio file. By default, on load_dataset("Hunzla/omnisonus") the resulting dataset will contain
an audio column holding an audio array and a sampling rate with default value 16000.
3. text => The transcription of the audio file, i.e. what the speaker says in the recording.
4. intention => Hypothetical column for a basic classification task to classify whether the customer is interested or not, assuming the audio
is a response from a customer.
5. accent => This represents the accent of the speaker.
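Since file_name packs the speaker metadata positionally, it can be decoded with a small helper. The sketch below is ours, for illustration only (the parse_file_name function and its field names are not part of the dataset):
```python
def parse_file_name(file_name: str) -> dict:
    """Decode speaker metadata from a 14-character file_name like '35fboren1960ms'."""
    return {
        "age": int(file_name[0:2]),   # first two digits: speaker age
        "gender": file_name[2],       # 'm' for male, 'f' for female
        "emotion": file_name[3:6],    # e.g. 'ang', 'bor', 'hap', 'neu'
        "language": file_name[6:8],   # ISO 639 language code, e.g. 'en'
        "duration": file_name[8:],    # duration with unit, usually ms
    }

print(parse_file_name("35fboren1960ms"))
# {'age': 35, 'gender': 'f', 'emotion': 'bor', 'language': 'en', 'duration': '1960ms'}
```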
## Terms and Conditions
This dataset is provided with the explicit understanding that it is intended solely for lawful and ethical purposes. Any use of this dataset for illegal, malicious, or unethical activities is strictly prohibited. By accessing or utilizing Omni-Sonus, you agree to adhere to the following guidelines:
1. Legal Compliance:
Omni-Sonus must not be used for any activities that violate local, national, or international laws. Users are expected to comply with all applicable regulations and statutes.
2. Ethical Use:
The dataset should be employed in a manner consistent with ethical standards and principles. Avoid any application that could cause harm, discomfort, or infringement upon the rights and privacy of individuals.
3. Non-Discrimination:
Ensure that the dataset is used without any form of discrimination, bias, or harm towards any individual or group based on factors such as race, gender, ethnicity, religion, or any other protected characteristics.
4. Privacy Protection:
Do not use Omni-Sonus in a way that compromises the privacy and confidentiality of individuals. Be cautious and responsible in handling any personally identifiable information that may be present in the dataset.
5. Intellectual Property Rights:
Respect and adhere to all intellectual property rights associated with the dataset. Unauthorized distribution, reproduction, or modification of the dataset is strictly prohibited.
6. Research and Educational Purposes:
While Omni-Sonus can be used for research and educational purposes, such activities should align with ethical standards and contribute positively to the advancement of knowledge.
7. No Unlawful Activities:
The dataset must not be utilized for any form of cybercrime, hacking, or other unlawful activities. Any attempt to compromise the integrity of systems or networks using Omni-Sonus is strictly forbidden.
Violation of these terms may result in legal consequences and the termination of access to the dataset. Users are urged to exercise responsible and ethical behavior when using Omni-Sonus and contribute positively to the development of technology and knowledge.
## Dataset Card Authors [optional]
- **Curated by:** Hunzla Usman & Syed Aun Zaidi.
- **Funded by [optional]:** Abacus Consulting (pvt) ltd.
- **Language(s) (NLP):** English (Multilingual speech(including Urdu) dataset will be released soon.)
## Dataset Card Contact
Email:
Syed Aun Zaidi => saunzaidi@gmail.com
Hunzla Usman => hunzlausman0000@gmail.com |
stevenc7/sdft_lr2hr | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 73545.0
num_examples: 5
download_size: 75595
dataset_size: 73545.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
YaYaB/onepiece-blip-captions | ---
license: cc-by-nc-sa-4.0
annotations_creators:
- machine-generated
language:
- en
language_creators:
- other
multilinguality:
- monolingual
pretty_name: 'One Piece BLIP captions'
size_categories:
- n<1K
source_datasets:
- YaYaB/onepiece-blip-captions
tags: []
task_categories:
- text-to-image
task_ids: []
---
# Disclaimer
This was inspired from https://huggingface.co/datasets/lambdalabs/pokemon-blip-captions
# Dataset Card for One Piece BLIP captions
_Dataset used to train [One Piece text to image model](https://github.com/LambdaLabsML/examples/tree/main/stable-diffusion-finetuning)_
BLIP generated captions for One piece images collected from the web. Original images were obtained from [Anime Characters](https://www.animecharactersdatabase.com) and captioned with the [pre-trained BLIP model](https://github.com/salesforce/BLIP).
For each row the dataset contains `image` and `text` keys. `image` is a varying size PIL jpeg, and `text` is the accompanying text caption. Only a train split is provided.
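For instance, it can be loaded and inspected with the datasets library (a minimal sketch):
```python
from datasets import load_dataset

dataset = load_dataset("YaYaB/onepiece-blip-captions", split="train")
example = dataset[0]
image = example["image"]   # varying-size PIL image
caption = example["text"]  # BLIP-generated caption, e.g. "a man in a straw hat"
```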
## Examples

> a man in a straw hat

> a man in a green coat holding two swords

> a man with red hair and a black coat
## Citation
If you use this dataset, please cite it as:
```
@misc{yayab2022onepiece,
author = {YaYaB},
title = {One Piece BLIP captions},
year={2022},
howpublished= {\url{https://huggingface.co/datasets/YaYaB/onepiece-blip-captions/}}
}
``` |
Aryansoni27/Amitabh_bachchan_voice | ---
license: mit
---
|
CATIE-AQ/bisect_fr_prompt_textual_merging | ---
language:
- fr
license:
- cc-by-nc-4.0
size_categories:
- 10M<n<100M
task_categories:
- summarization
tags:
- textual-fusion
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- bisect
---
# bisect_fr_prompt_textual_merging
## Summary
**bisect_fr_prompt_textual_merging** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **10,383,891** rows that can be used for a textual fusion task.
The original data (without prompts) comes from the dataset [BiSECT](https://huggingface.co/datasets/GEM/BiSECT) by Kim et al. where only the French part has been kept.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
21 prompts were created for this dataset. The logic applied consists in proposing each prompt in the infinitive, in the informal tutoiement form ("tu"), and in the formal vouvoiement form ("vous").
```
'Fusionner les deux phrases suivantes en une seule tout en conservant leurs sens : "'+source+'" Version fusionnée : ',
'Fusionne les deux phrases suivantes en une seule tout en conservant leurs sens : "'+source+'" Version fusionnée : ',
'Fusionnez les deux phrases suivantes en une seule tout en conservant leurs sens : "'+source+'" Version fusionnée : ',
'Combiner les deux phrases suivantes en une seule tout en conservant leurs sens : "'+source+'" Version combinée : ',
'Combine les deux phrases suivantes en une seule tout en conservant leurs sens : "'+source+'" Version combinée : ',
'Combinez les deux phrases suivantes en une seule tout en conservant leurs sens : "'+source+'" Version combinée : ',
'Réunir les deux phrases suivantes en une seule tout en conservant leurs sens : "'+source+'" Version réunie : ',
'Réunis les deux phrases suivantes en une seule tout en conservant leurs sens : "'+source+'" Version réunie : ',
'Réunissez les deux phrases suivantes en une seule tout en conservant leurs sens : "'+source+'" Version réunie : ',
'"'+source+' Fournir une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Fournis une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Fournissez une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Ecrire une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Ecris une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Ecrivez une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Rédiger une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Rédige une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Rédigez une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Générer une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Génère une version synonyme en une phrase des deux phrases précédentes : ',
'"'+source+' Générez une version synonyme en une phrase des deux phrases précédentes : '
```
### Features used in the prompts
In the prompt list above, `source` and `targets` have been constructed from:
```
from datasets import load_dataset

bisect = load_dataset('GEM/BiSECT','fr')
# For row i, the sentence pair to merge comes from BiSECT's 'target' column and
# the merged reference from its 'source' column (the task is the reverse of
# BiSECT's split-and-rephrase direction).
source = bisect['train'][i]['target'].replace(' . ','. ').replace(' .','. ').replace(' , ',', ').replace(', ',', ').replace(' _SPLIT_','')[:-1]
targets = bisect['train'][i]['source'].replace(' . ','. ').replace(' .','. ').replace(' , ',', ').replace(', ',', ').replace('_SPLIT_','')[:-1]
```
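As an illustration, one (input, target) pair can then be assembled by concatenating a template with the cleaned text, reusing `source` and `targets` from the snippet above (a minimal sketch; the `inputs`/`targets` column names follow the xP3 convention and are an assumption):
```python
# Sketch: apply the first template to build one training example.
inputs = 'Fusionner les deux phrases suivantes en une seule tout en conservant leurs sens : "' + source + '" Version fusionnée : '
example = {"inputs": inputs, "targets": targets}
```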
# Splits
- `train` with 10,311,735 samples
- `valid` with 50,400 samples
- `test` with 21,756 samples
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/bisect_fr_prompt_textual_merging")
```
# Citation
## Original data
> @inproceedings{bisect2021,
> title={BiSECT: Learning to Split and Rephrase Sentences with Bitexts},
> author={Kim, Joongwon and Maddela, Mounica and Kriz, Reno and Xu, Wei and Callison-Burch, Chris},
> booktitle={Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP)},
> year={2021}
> }
## This Dataset
> @misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
> author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
> title = { DFP (Revision 1d24c09) },
> year = 2023,
> url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
> doi = { 10.57967/hf/1200 },
> publisher = { Hugging Face }
> }
## License
cc-by-nc-4.0 |
Dahoas/split_cot_gsm8k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: next_sent
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 33022180
num_examples: 26309
- name: test
num_bytes: 6252495
num_examples: 4909
- name: val
num_bytes: 1223547
num_examples: 957
download_size: 9385640
dataset_size: 40498222
---
# Dataset Card for "split_cot_gsm8k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
316usman/const_dataset_2 | ---
dataset_info:
features:
- name: train
dtype: string
splits:
- name: train
num_bytes: 19352633
num_examples: 8153
download_size: 4941592
dataset_size: 19352633
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "const_dataset_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VerminRed/Cortex | ---
license: openrail
---
|
danaroth/washington_dc_mall | ---
license: unknown
---
# Description
This dataset contains an airborne hyperspectral flightline over the Washington DC Mall, provided with the permission of the Spectral Information Technology Application Center of Virginia, which was responsible for its collection. The HYDICE sensor system used in this case measured pixel response in 210 bands in the 0.4 to 2.4 μm region of the visible and infrared spectrum. Bands in the 0.9 and 1.4 μm regions, where the atmosphere is opaque, have been omitted from the data set, leaving 191 bands. The data set contains 1208 scan lines with 307 pixels in each scan line, and totals approximately 150 megabytes.
# Characteristics
Washington DC Mall data set classes, labels and the number of samples.
| # | Class | Samples |
|---|----------------|---------|
| 1 | Roofs | 21419 |
| 2 | Street | 9834 |
| 3 | Grass | 22873 |
| 4 | Trees | 6882 |
| 5 | Path | 1105 |
| 6 | Water | 11063 |
| 7 | Shadow | 3061 |
# Quick look
<figure>
<img src= "assets/1771082.gif" alt="Washington DC Mall" width="300" />
<figcaption>Fake color visualization of the Washington DC Mall dataset, with bands 60, 27, 17 for red, green, blue respectively.</figcaption>
</figure>
<figure>
<img src= "assets/4264435.gif" alt="Indian Pines gt" width="300" />
<figcaption>Groundtruth of Washington DC Mall dataset.</figcaption>
</figure>
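A minimal sketch for loading the cube and reproducing the false-color composite above (assumptions: the data ships as the `dc.tif` file mentioned in the Credits, stored band-first, and the `tifffile` package is available):
```python
import numpy as np
import tifffile

# Load the hyperspectral cube; expected shape around (191, 1208, 307),
# i.e. bands x scan lines x pixels (the axis order is an assumption).
cube = tifffile.imread("dc.tif").astype(np.float32)

# False-color composite from bands 60, 27, 17 (red, green, blue).
rgb = np.stack([cube[60], cube[27], cube[17]], axis=-1)
rgb = (rgb - rgb.min()) / (rgb.max() - rgb.min())  # normalize to [0, 1]
```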
# Credits
Dataset originally available as part of the MultiSpec project at: https://engineering.purdue.edu/~biehl/MultiSpec/hyperspectral.html
Copyright (C) 1994-2020 Purdue Research Foundation.
Work leading to MultiSpec was funded in part by NASA Grants NAGW-925, NAGW-3924 and NAGW5-3975.
Supported by AmericaView (www.americaview.org)
The hyperspectral data set (dc.tif) of the Washington, DC mall area is provided with the permission of the Spectral Information Technology Application Center of Virginia, which was responsible for its collection.
|
Shuchen/codeparrot-valid | ---
license: apache-2.0
---
|
DianaJin/voice | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 31702768
num_examples: 33
- name: test
num_bytes: 4804440
num_examples: 5
- name: valid
num_bytes: 3843216
num_examples: 4
download_size: 14314631
dataset_size: 40350424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
emre/Open_SLR108_Turkish_10_hours | ---
license: cc-by-4.0
tags:
- robust-speech-event
datasets:
- MediaSpeech
---
MediaSpeech
Identifier: SLR108
Summary: French, Arabic, Turkish and Spanish media speech datasets
Category: Speech
License: The dataset is distributed under the Creative Commons Attribution 4.0 International License.
About this resource:
MediaSpeech is a dataset of French, Arabic, Turkish and Spanish media speech built with the purpose of testing Automated Speech Recognition (ASR) systems performance. The dataset contains 10 hours of speech for each language provided.
The dataset consists of short speech segments automatically extracted from media videos available on YouTube and manually transcribed, with some pre- and post-processing.
Baseline models and wav version of the dataset can be found in the following git repository: https://github.com/NTRLab/MediaSpeech
```
@misc{mediaspeech2021,
title={MediaSpeech: Multilanguage ASR Benchmark and Dataset},
author={Rostislav Kolobov and Olga Okhapkina and Olga Omelchishina and Andrey Platunov and Roman Bedyakin and Vyacheslav Moshkin and Dmitry Menshikov and Nikolay Mikhaylovskiy},
year={2021},
eprint={2103.16193},
archivePrefix={arXiv},
primaryClass={eess.AS}
}
```
|
davidgaofc/d_PriMa5_inout | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1207800
num_examples: 1820
download_size: 334761
dataset_size: 1207800
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nexdata/28237_Intent_type_single_sentence_annotation_data | ---
license: cc-by-nc-nd-4.0
---
## Description
This dataset contains intent-type single-sentence annotated text: 28,237 manually written sentences annotated with intent classes, including slot and slot-value information. The intent domains include music, weather, date, schedule, home equipment, etc. The data can be applied to intent recognition research and related fields.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1029?source=Huggingface
# Specifications
## Content
intent-type single sentence annotation data
## Label Content
Sentences were manually written with corresponding intents and annotated with intent classes.
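A hypothetical illustration of an annotated sentence (field names and values are invented for illustration; the actual Excel schema may differ):
```python
# Hypothetical annotated row (illustrative only).
example = {
    "sentence": "明天北京的天气怎么样",  # "What's the weather like in Beijing tomorrow?"
    "intent": "weather_query",
    "slots": {"date": "明天", "city": "北京"},
}
```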
## Storage Format
Excel
## Language
Chinese
## Data Size
28,237 sentences
## Accuracy Rate
95%
# Licensing Information
Commercial License
|
automated-research-group/llama2_7b_chat-arc_easy-results | ---
dataset_info:
config_name: '{''do_sample''=False, ''beams''=1}'
features:
- name: id
dtype: string
- name: prediction
dtype: string
- name: arc_challenge_accuracy
dtype: bool
splits:
- name: train
num_bytes: 135288
num_examples: 570
download_size: 71128
dataset_size: 135288
configs:
- config_name: '{''do_sample''=False, ''beams''=1}'
data_files:
- split: train
path: '{''do_sample''=False, ''beams''=1}/train-*'
---
|
netcat420/MHENN4 | ---
license: mit
---
|
Nicolas-BZRD/uld_loss_Llama-2-7b-chat-hf-squad | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: answers_generated
dtype: string
splits:
- name: train
num_bytes: 73788989
num_examples: 83214
- name: validation
num_bytes: 3870325
num_examples: 4380
download_size: 50155237
dataset_size: 77659314
---
# Dataset Card for "uld_loss_Llama-2-7b-chat-hf-squad"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nikesh66/Slang-Dataset | ---
language:
- en
size_categories:
- 1K<n<10K
---
# Slang Dataset
It contains artificially generated slang data along with labels.
## Dataset Description:
- Number of Rows: 5,000
- Number of Columns: 2
- Column Names: 'Tweet', 'Sarcasm (yes/no)'
- Description: This dataset features tweets labeled for sarcasm. Each tweet is accompanied by a label ('yes' or 'no') indicating whether the tweet is sarcastic.
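A minimal loading sketch, assuming the standard `datasets` API and the column names listed above:
```python
from datasets import load_dataset

ds = load_dataset("nikesh66/Slang-Dataset", split="train")
print(ds[0]["Tweet"], ds[0]["Sarcasm (yes/no)"])
```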
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_11_500 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 953
num_examples: 32
download_size: 2030
dataset_size: 953
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_11_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
baicuya/images | ---
license: openrail
---
|
meliascosta/wiki_academic_subjects | ---
license: cc-by-3.0
annotations_creators:
- crowdsourced
language:
- en
language_creators:
- crowdsourced
multilinguality:
- monolingual
paperswithcode_id: wikitext-2
pretty_name: Wikipedia Outline of Academic Disciplines
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- hierarchical
- academic
- tree
- dag
- topics
- subjects
task_categories:
- text-classification
task_ids:
- multi-label-classification
---
# Dataset Card for Wiki Academic Disciplines
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset was created from the [English wikipedia](https://meta.wikimedia.org/wiki/Data_dump_torrents#English_Wikipedia) dump of January 2022.
The main goal was to train a hierarchical classifier of academic subjects using [HiAGM](https://github.com/Alibaba-NLP/HiAGM).
### Supported Tasks and Leaderboards
Text classification - No leaderboard at the moment.
### Languages
English
## Dataset Structure
The dataset consists of groups of labeled text chunks (tokenized by spaces and with stopwords removed).
Labels are organized in a hierarchy (a DAG with a special Root node) of academic subjects.
Nodes correspond to entries in the [outline of academic disciplines](https://en.wikipedia.org/wiki/Outline_of_academic_disciplines) article from Wikipedia.
### Data Instances
Data is split into train/test/val, each in a separate `.jsonl` file. The label hierarchy is listed as a TAB-separated adjacency list in a `.taxonomy` file.
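For illustration, each line of the adjacency list pairs a parent node with its TAB-separated children (node names below are hypothetical):
```
Root	Humanities	Natural sciences	Social sciences
Humanities	Philosophy	History	Linguistics
```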
### Data Fields
JSONL files contain only two fields: a "token" field which holds the text tokens and a "label" field which holds a list of labels for that text.
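A hypothetical example line (token and label values are illustrative):
```
{"token": ["derivative", "integral", "limit"], "label": ["Natural sciences", "Mathematics"]}
```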
### Data Splits
An 80/10/10 train/test/val split.
## Dataset Creation
All texts were extracted by following the linked articles in the [outline of academic disciplines](https://en.wikipedia.org/wiki/Outline_of_academic_disciplines).
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
Wiki Dump
#### Who are the source language producers?
Wikipedia community.
### Annotations
#### Annotation process
Texts were automatically assigned to their linked academic discipline.
#### Who are the annotators?
Wikipedia Community.
### Personal and Sensitive Information
All information is public.
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Creative Commons 3.0 (see [Wikipedia:Copyrights](https://en.wikipedia.org/wiki/Wikipedia:Copyrights))
### Citation Information
1. Zhou, Jie, et al. "Hierarchy-aware global model for hierarchical text classification." Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020.
### Contributions
Thanks to [@meliascosta](https://github.com/meliascosta) for adding this dataset.
|
niizam/brainly | ---
license: unlicense
task_categories:
- question-answering
language:
- id
---
### brainly.co.id dataset
### Data Structure
The keys in each JSONL object include:
- "id": An integer value representing the page of task from url (e.g. brainly.co.id/tugas/117).
- "subject": A string indicating the subject of the question (e.g., "Fisika", "Matematika", "Sejarah").
- "author": A string representing the author of the question.
- "instruction": A string providing the instruction or prompt for the question.
- "answerer_1", "answer_2": Strings representing the answerers for the question. The number at the end of the key (1 & 2) signifies the answer's index.
- "answer_1", "answer_2": Strings containing the answers provided by the answerers. The number at the end of the key corresponds to the answerer index.
- "status_1", "status_2": Strings indicating the status of the answers (e.g., "verified", "loved", "generic"). |
autoevaluate/autoeval-staging-eval-project-Blaise-g__SumPubmed-93d67e8f-12255639 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- Blaise-g/SumPubmed
eval_info:
task: summarization
model: Blaise-g/long_t5_global_large_baseline_pubmed
metrics: []
dataset_name: Blaise-g/SumPubmed
dataset_config: Blaise-g--SumPubmed
dataset_split: test
col_mapping:
text: text
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Blaise-g/long_t5_global_large_baseline_pubmed
* Dataset: Blaise-g/SumPubmed
* Config: Blaise-g--SumPubmed
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Blaise-g](https://huggingface.co/Blaise-g) for evaluating this model. |
ruskape/Test | ---
license: openrail
---
|
plaguss/distilabel-sample-evol-instruct | ---
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: instruction
sequence: string
splits:
- name: train
num_bytes: 29741
num_examples: 20
download_size: 16457
dataset_size: 29741
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- synthetic
- distilabel
---
|
Mlxa/nested | ---
license: apache-2.0
---
|
Glac1er/gehshin | ---
license: unknown
---
|
open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B | ---
pretty_name: Evaluation run of Joseph717171/Genstruct-10.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Joseph717171/Genstruct-10.7B](https://huggingface.co/Joseph717171/Genstruct-10.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T15:50:47.030919](https://huggingface.co/datasets/open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B/blob/main/results_2024-03-30T15-50-47.030919.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6058365742339414,\n\
\ \"acc_stderr\": 0.03286052164816604,\n \"acc_norm\": 0.6065695629397805,\n\
\ \"acc_norm_stderr\": 0.033526233034810754,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.4666302750761303,\n\
\ \"mc2_stderr\": 0.015225617830989736\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.014467631559137996,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938213\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6475801633140809,\n\
\ \"acc_stderr\": 0.004767475366689767,\n \"acc_norm\": 0.8281218880701056,\n\
\ \"acc_norm_stderr\": 0.0037650342861534386\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113728,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113728\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270286,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.01626567563201036,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.01626567563201036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.04284467968052194,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.04284467968052194\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560417,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560417\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333567,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165555,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\
\ \"acc_stderr\": 0.01498732543996355,\n \"acc_norm\": 0.2782122905027933,\n\
\ \"acc_norm_stderr\": 0.01498732543996355\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804015,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804015\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n\
\ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778855,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4028683181225554,\n\
\ \"acc_stderr\": 0.012526955577118016,\n \"acc_norm\": 0.4028683181225554,\n\
\ \"acc_norm_stderr\": 0.012526955577118016\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6078431372549019,\n \"acc_stderr\": 0.01975172650876264,\n \
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.01975172650876264\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587952,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587952\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.4666302750761303,\n\
\ \"mc2_stderr\": 0.015225617830989736\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827934\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \
\ \"acc_stderr\": 0.013373971277729815\n }\n}\n```"
repo_url: https://huggingface.co/Joseph717171/Genstruct-10.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|winogrande|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T15-50-47.030919.parquet'
- config_name: results
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- results_2024-03-30T15-50-47.030919.parquet
- split: latest
path:
- results_2024-03-30T15-50-47.030919.parquet
---
# Dataset Card for Evaluation run of Joseph717171/Genstruct-10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Joseph717171/Genstruct-10.7B](https://huggingface.co/Joseph717171/Genstruct-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B",
"harness_winogrande_5",
split="train")
```
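To load the aggregated scores instead, the `results` configuration can be read the same way; its `latest` split points at the most recent run:
```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B",
	"results",
	split="latest")
```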
## Latest results
These are the [latest results from run 2024-03-30T15:50:47.030919](https://huggingface.co/datasets/open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B/blob/main/results_2024-03-30T15-50-47.030919.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6058365742339414,
"acc_stderr": 0.03286052164816604,
"acc_norm": 0.6065695629397805,
"acc_norm_stderr": 0.033526233034810754,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.4666302750761303,
"mc2_stderr": 0.015225617830989736
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.014467631559137996,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.014264122124938213
},
"harness|hellaswag|10": {
"acc": 0.6475801633140809,
"acc_stderr": 0.004767475366689767,
"acc_norm": 0.8281218880701056,
"acc_norm_stderr": 0.0037650342861534386
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113728,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113728
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.01626567563201036,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.01626567563201036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.04284467968052194,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.04284467968052194
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560417,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560417
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333567,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165555,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.01498732543996355,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.01498732543996355
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804015,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804015
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778855,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4028683181225554,
"acc_stderr": 0.012526955577118016,
"acc_norm": 0.4028683181225554,
"acc_norm_stderr": 0.012526955577118016
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.01975172650876264,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.01975172650876264
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587952,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587952
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.4666302750761303,
"mc2_stderr": 0.015225617830989736
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827934
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.013373971277729815
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
santos-marco/QeA-GD-fine-tuning-llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 251198
num_examples: 247
download_size: 64060
dataset_size: 251198
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Gabriel1322/jeimes | ---
license: openrail
---
|
akomma/uss-ratings-dataset | ---
license: mit
task_categories:
- text-classification
- zero-shot-classification
- conversational
language:
- en
pretty_name: uss-ratings-dataset
size_categories:
- 10K<n<100K
---
### Dataset Description
- **Homepage:** https://github.com/sunnweiwei/user-satisfaction-simulation
- **Repository:** https://github.com/sunnweiwei/user-satisfaction-simulation
- **Paper:** https://arxiv.org/pdf/2105.03748.pdf
- **View records using Datasette:** [datasette-link](https://lite.datasette.io/?parquet=https%3A%2F%2Fhuggingface.co%2Fdatasets%2Fakomma%2Fuss-ratings-dataset%2Fresolve%2Fmain%2Fuss-ratings-dataset-datasette.parquet#/data/uss-ratings-dataset-datasette)
### Dataset Summary
- Dialog quality dataset
- Both turn-level and dialog-level ratings are provided on a scale of 1 to 5 by human annotators.
- Each task has been annotated by multiple annotators.
- Contains annotated dialogs from 4 different datasets (SGD, MultiWOZ, ReDial, CCPE)
- 34,358 turns in total from 3,500 dialogs
|Dataset|Dialogs|Turns |
|-------|------:|-----:|
|SGD | 1000 | 11833|
|MWOZ | 1000 | 10553|
|Redial | 1000 | 6792 |
|CCPE | 500 | 5180 |
### Column Definitions
|Column |Type |Example Value |Description |
|-------------------|-------|-------------------------|-----------------------------------------------|
|split | str | CCPE;MWOZ;SGD;Redial | dataset name |
|session_idx | int | 1 | dialog identifier |
|turn_idx | int | 1 | turn identifier within a dialog |
|tree_idx | int | 1 | tree identifier within a turn (is all 1s here)|
|system | str | Do you like movies | system message |
|user | str | No I don't like | user message |
|turn_scores | list | [3; 2; 2] | list of turn-level quality scores from different human annotations|
|mean_turn_rating | float | 2.33 | mean of turn-level annotator scores |
|mode_turn_rating | int | 2 | mode of turn-level annotator scores |
|dialog_scores | list | [3; 3; 3] | list of dialog-level quality scores from different human annotations|
|mean_dialog_rating | float | 3.00 | mean of dialog-level annotator scores |
|mode_dialog_rating | int | 3 | mode of dialog-level annotator scores |
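A minimal sketch for loading the data and isolating one source dataset, assuming the records are exposed as a single `train` split with the columns documented above:
```python
from datasets import load_dataset

# Load the ratings; the "split" column names the source dataset
ds = load_dataset("akomma/uss-ratings-dataset", split="train")
# Keep only the CCPE dialogs
ccpe = ds.filter(lambda row: row["split"] == "CCPE")
```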
|
cakiki/dockerfile_paths | ---
dataset_info:
features:
- name: repository_name
dtype: string
splits:
- name: train
num_bytes: 36265516
num_examples: 1274173
download_size: 23300431
dataset_size: 36265516
---
# Dataset Card for "dockerfile_paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jwestcott/fava-flagged-demo | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for fava-flagged-demo
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DoKoB/Pizza_Agent | ---
license: apache-2.0
task_categories:
- table-question-answering
language:
- en
size_categories:
- n<1K
--- |
distilled-one-sec-cv12-each-chunk-uniq/chunk_179 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1383027812.0
num_examples: 269491
download_size: 1414587782
dataset_size: 1383027812.0
---
# Dataset Card for "chunk_179"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sasha/prof_images_blip__22h-vintedois-diffusion-v0-1 | ---
dataset_info:
features:
- name: images
dtype: image
- name: embeddings
sequence: float32
splits:
- name: courier
num_bytes: 3640022.0
num_examples: 100
- name: aide
num_bytes: 3358153.0
num_examples: 100
- name: police_officer
num_bytes: 3522932.0
num_examples: 100
- name: purchasing_agent
num_bytes: 3286344.0
num_examples: 100
- name: metal_worker
num_bytes: 4410266.0
num_examples: 100
- name: financial_analyst
num_bytes: 3727701.0
num_examples: 100
- name: stocker
num_bytes: 3778322.0
num_examples: 100
- name: it_specialist
num_bytes: 4021431.0
num_examples: 100
- name: writer
num_bytes: 4150377.0
num_examples: 100
- name: accountant
num_bytes: 3206485.0
num_examples: 100
- name: coach
num_bytes: 3644886.0
num_examples: 100
- name: painter
num_bytes: 4259647.0
num_examples: 100
- name: real_estate_broker
num_bytes: 3439406.0
num_examples: 100
- name: truck_driver
num_bytes: 4438012.0
num_examples: 100
- name: data_entry_keyer
num_bytes: 3900333.0
num_examples: 100
- name: computer_support_specialist
num_bytes: 3641931.0
num_examples: 100
- name: cook
num_bytes: 3467370.0
num_examples: 100
- name: interior_designer
num_bytes: 4011621.0
num_examples: 100
- name: nutritionist
num_bytes: 3657524.0
num_examples: 100
- name: designer
num_bytes: 3433880.0
num_examples: 100
- name: maid
num_bytes: 3236767.0
num_examples: 100
- name: producer
num_bytes: 3807892.0
num_examples: 100
- name: executive_assistant
num_bytes: 3199680.0
num_examples: 100
- name: logistician
num_bytes: 4051060.0
num_examples: 100
- name: tractor_operator
num_bytes: 5097668.0
num_examples: 100
- name: doctor
num_bytes: 3124348.0
num_examples: 100
- name: inventory_clerk
num_bytes: 3830045.0
num_examples: 100
- name: sheet_metal_worker
num_bytes: 4221678.0
num_examples: 100
- name: groundskeeper
num_bytes: 4363064.0
num_examples: 100
- name: electrical_engineer
num_bytes: 4562412.0
num_examples: 100
- name: physical_therapist
num_bytes: 3189145.0
num_examples: 100
- name: insurance_agent
num_bytes: 3040990.0
num_examples: 100
- name: aerospace_engineer
num_bytes: 4278650.0
num_examples: 100
- name: psychologist
num_bytes: 3143650.0
num_examples: 100
- name: financial_advisor
num_bytes: 3196183.0
num_examples: 100
- name: printing_press_operator
num_bytes: 4494714.0
num_examples: 100
- name: architect
num_bytes: 3890945.0
num_examples: 100
- name: dental_hygienist
num_bytes: 3079331.0
num_examples: 100
- name: artist
num_bytes: 4244089.0
num_examples: 100
- name: office_worker
num_bytes: 3462709.0
num_examples: 100
- name: ceo
num_bytes: 2997987.0
num_examples: 100
- name: taxi_driver
num_bytes: 4394782.0
num_examples: 100
- name: librarian
num_bytes: 3984923.0
num_examples: 100
- name: author
num_bytes: 3813285.0
num_examples: 100
- name: plumber
num_bytes: 4141970.0
num_examples: 100
- name: construction_worker
num_bytes: 3894234.0
num_examples: 100
- name: clergy
num_bytes: 3282554.0
num_examples: 100
- name: electrician
num_bytes: 4371703.0
num_examples: 100
- name: jailer
num_bytes: 4465435.0
num_examples: 100
- name: credit_counselor
num_bytes: 3139784.0
num_examples: 100
- name: scientist
num_bytes: 3489240.0
num_examples: 100
- name: drywall_installer
num_bytes: 3579519.0
num_examples: 100
- name: school_bus_driver
num_bytes: 4491302.0
num_examples: 100
- name: dental_assistant
num_bytes: 2979208.0
num_examples: 100
- name: fitness_instructor
num_bytes: 3536687.0
num_examples: 100
- name: detective
num_bytes: 3347937.0
num_examples: 100
- name: hairdresser
num_bytes: 3080679.0
num_examples: 100
- name: welder
num_bytes: 4766685.0
num_examples: 100
- name: pharmacy_technician
num_bytes: 4070319.0
num_examples: 100
- name: compliance_officer
num_bytes: 3471476.0
num_examples: 100
- name: singer
num_bytes: 3452677.0
num_examples: 100
- name: tutor
num_bytes: 3752207.0
num_examples: 100
- name: language_pathologist
num_bytes: 3587772.0
num_examples: 100
- name: medical_records_specialist
num_bytes: 3542194.0
num_examples: 100
- name: sales_manager
num_bytes: 2974259.0
num_examples: 100
- name: industrial_engineer
num_bytes: 4262086.0
num_examples: 100
- name: manager
num_bytes: 3106718.0
num_examples: 100
- name: mechanic
num_bytes: 4215182.0
num_examples: 100
- name: postal_worker
num_bytes: 4036524.0
num_examples: 100
- name: computer_systems_analyst
num_bytes: 3991071.0
num_examples: 100
- name: salesperson
num_bytes: 3111991.0
num_examples: 100
- name: office_clerk
num_bytes: 3599917.0
num_examples: 100
- name: claims_appraiser
num_bytes: 3836012.0
num_examples: 100
- name: security_guard
num_bytes: 3879575.0
num_examples: 100
- name: interviewer
num_bytes: 3016817.0
num_examples: 100
- name: dispatcher
num_bytes: 4344042.0
num_examples: 100
- name: lawyer
num_bytes: 3271761.0
num_examples: 100
- name: marketing_manager
num_bytes: 3238255.0
num_examples: 100
- name: customer_service_representative
num_bytes: 3217003.0
num_examples: 100
- name: software_developer
num_bytes: 3364068.0
num_examples: 100
- name: mover
num_bytes: 3581012.0
num_examples: 100
- name: supervisor
num_bytes: 3452055.0
num_examples: 100
- name: paralegal
num_bytes: 3135751.0
num_examples: 100
- name: graphic_designer
num_bytes: 4484133.0
num_examples: 100
- name: dentist
num_bytes: 3104962.0
num_examples: 100
- name: roofer
num_bytes: 4264565.0
num_examples: 100
- name: public_relations_specialist
num_bytes: 3298954.0
num_examples: 100
- name: engineer
num_bytes: 3867898.0
num_examples: 100
- name: occupational_therapist
num_bytes: 3205012.0
num_examples: 100
- name: manicurist
num_bytes: 3112145.0
num_examples: 100
- name: cleaner
num_bytes: 3475664.0
num_examples: 100
- name: facilities_manager
num_bytes: 3562539.0
num_examples: 100
- name: repair_worker
num_bytes: 3906302.0
num_examples: 100
- name: cashier
num_bytes: 3728570.0
num_examples: 100
- name: baker
num_bytes: 3871443.0
num_examples: 100
- name: market_research_analyst
num_bytes: 3889616.0
num_examples: 100
- name: health_technician
num_bytes: 3356530.0
num_examples: 100
- name: veterinarian
num_bytes: 3249094.0
num_examples: 100
- name: underwriter
num_bytes: 3279381.0
num_examples: 100
- name: mechanical_engineer
num_bytes: 4452331.0
num_examples: 100
- name: janitor
num_bytes: 3733976.0
num_examples: 100
- name: pilot
num_bytes: 3918334.0
num_examples: 100
- name: therapist
num_bytes: 3077311.0
num_examples: 100
- name: director
num_bytes: 3305172.0
num_examples: 100
- name: wholesale_buyer
num_bytes: 3863590.0
num_examples: 100
- name: air_conditioning_installer
num_bytes: 4217322.0
num_examples: 100
- name: butcher
num_bytes: 4351854.0
num_examples: 100
- name: machinery_mechanic
num_bytes: 4614179.0
num_examples: 100
- name: event_planner
num_bytes: 3547339.0
num_examples: 100
- name: carpet_installer
num_bytes: 4389212.0
num_examples: 100
- name: musician
num_bytes: 3639823.0
num_examples: 100
- name: civil_engineer
num_bytes: 3841611.0
num_examples: 100
- name: farmer
num_bytes: 4438706.0
num_examples: 100
- name: financial_manager
num_bytes: 3181723.0
num_examples: 100
- name: childcare_worker
num_bytes: 3586015.0
num_examples: 100
- name: clerk
num_bytes: 3213913.0
num_examples: 100
- name: machinist
num_bytes: 4295487.0
num_examples: 100
- name: firefighter
num_bytes: 4077232.0
num_examples: 100
- name: photographer
num_bytes: 3606746.0
num_examples: 100
- name: file_clerk
num_bytes: 4350476.0
num_examples: 100
- name: bus_driver
num_bytes: 4250786.0
num_examples: 100
- name: fast_food_worker
num_bytes: 3606432.0
num_examples: 100
- name: bartender
num_bytes: 4221598.0
num_examples: 100
- name: computer_programmer
num_bytes: 4180355.0
num_examples: 100
- name: pharmacist
num_bytes: 3996786.0
num_examples: 100
- name: nursing_assistant
num_bytes: 3158704.0
num_examples: 100
- name: career_counselor
num_bytes: 3415428.0
num_examples: 100
- name: mental_health_counselor
num_bytes: 3347482.0
num_examples: 100
- name: network_administrator
num_bytes: 4600993.0
num_examples: 100
- name: teacher
num_bytes: 3580451.0
num_examples: 100
- name: dishwasher
num_bytes: 4831099.0
num_examples: 100
- name: teller
num_bytes: 3422422.0
num_examples: 100
- name: teaching_assistant
num_bytes: 3551890.0
num_examples: 100
- name: payroll_clerk
num_bytes: 3292390.0
num_examples: 100
- name: laboratory_technician
num_bytes: 3898419.0
num_examples: 100
- name: social_assistant
num_bytes: 3222358.0
num_examples: 100
- name: radiologic_technician
num_bytes: 3937403.0
num_examples: 100
- name: social_worker
num_bytes: 3582335.0
num_examples: 100
- name: nurse
num_bytes: 3123385.0
num_examples: 100
- name: receptionist
num_bytes: 3372519.0
num_examples: 100
- name: carpenter
num_bytes: 4415058.0
num_examples: 100
- name: correctional_officer
num_bytes: 4070069.0
num_examples: 100
- name: community_manager
num_bytes: 3301952.0
num_examples: 100
- name: massage_therapist
num_bytes: 2954838.0
num_examples: 100
- name: head_cook
num_bytes: 3612046.0
num_examples: 100
- name: plane_mechanic
num_bytes: 3974652.0
num_examples: 100
download_size: 567913139
dataset_size: 544061331.0
---
# Dataset Card for "prof_images_blip__22h-vintedois-diffusion-v0-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kaludi/data-food-classification | ---
task_categories:
- image-classification
---
# Dataset for project: food-classification
## Dataset Description
This dataset has been processed for project food-classification.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<308x512 RGB PIL image>",
"target": 0
},
{
"image": "<512x512 RGB PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['apple_pie', 'falafel', 'french_toast', 'ice_cream', 'ramen', 'sushi', 'tiramisu'], id=None)"
}
```
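Since `target` is a `ClassLabel`, the integer can be mapped back to its class name; a minimal sketch, assuming the default configuration loads directly:
```python
from datasets import load_dataset

ds = load_dataset("Kaludi/data-food-classification", split="train")
# Map the integer target back to its class name, e.g. 0 -> "apple_pie"
label_name = ds.features["target"].int2str(ds[0]["target"])
```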
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1050 |
| valid | 350 |
|
fathyshalab/clinic-kitchen_and_dining | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 66661.34844444445
num_examples: 787
- name: test
num_bytes: 28629.651555555556
num_examples: 338
download_size: 0
dataset_size: 95291.0
---
# Dataset Card for "clinic-kitchen_and_dining"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChaiML/20240222_chai_prize_reward_model_data | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: labels
dtype: int64
- name: season
dtype: string
splits:
- name: train
num_bytes: 10206216
num_examples: 5164
download_size: 5776483
dataset_size: 10206216
---
# Dataset Card for "20240222_chai_prize_reward_model_data"
Chai Prize Reward Dataset now includes double thumbs up!
**labels:**
- 0: thumbs down
- 1: thumbs up
- 2: double thumbs up
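For example, to keep only the double-thumbs-up conversations (a minimal sketch using the columns above):
```python
from datasets import load_dataset

ds = load_dataset("ChaiML/20240222_chai_prize_reward_model_data", split="train")
# labels == 2 marks "double thumbs up"
double_up = ds.filter(lambda row: row["labels"] == 2)
```
|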
xjyplayer/sweet_li | ---
license: apache-2.0
---
|
mnaguib/WikiNER | ---
configs:
- config_name: en
data_files:
- split: train
path: "data/en/train.parquet"
- split: test
path: "data/en/test.parquet"
- config_name: fr
data_files:
- split: train
path: "data/fr/train.parquet"
- split: test
path: "data/fr/test.parquet"
- config_name: es
data_files:
- split: train
path: "data/es/train.parquet"
- split: test
path: "data/es/test.parquet"
- config_name: de
data_files:
- split: train
path: "data/de/train.parquet"
- split: test
path: "data/de/test.parquet"
- config_name: it
data_files:
- split: train
path: "data/it/train.parquet"
- split: test
path: "data/it/test.parquet"
- config_name: ru
data_files:
- split: train
path: "data/ru/train.parquet"
- split: test
path: "data/ru/test.parquet"
- config_name: pl
data_files:
- split: train
path: "data/pl/train.parquet"
- split: test
path: "data/pl/test.parquet"
- config_name: pt
data_files:
- split: train
path: "data/pt/train.parquet"
- split: test
path: "data/pt/test.parquet"
---
WikiNER is a multilingual, silver-standard annotated NER dataset. It consists of a late-2010 snapshot of Wikipedia in nine languages. Hyperlinks referring to persons, locations or organizations were automatically annotated.

```
@Article{nothman2012:artint:wikiner,
author = {Joel Nothman and Nicky Ringland and Will Radford and Tara Murphy and James R. Curran},
title = {Learning multilingual named entity recognition from {Wikipedia}},
journal = {Artificial Intelligence},
publisher = {Elsevier},
volume = {194},
pages = {151--175},
year = {2012},
doi = {10.1016/j.artint.2012.03.006},
url = {http://dx.doi.org/10.1016/j.artint.2012.03.006}
}
``` |
Tongjilibo/THUCNews | ---
license: apache-2.0
---
# 1. Datasets
| Dataset name | Use | Download link |
| ---------------- | -------------------- | --------------------------------------------------------------------------------------------------------------------------------- |
| THUCNews | text classification, text generation | [THUCNews](http://thuctc.thunlp.org/#%E4%B8%AD%E6%96%87%E6%96%87%E6%9C%AC%E5%88%86%E7%B1%BB%E6%95%B0%E6%8D%AE%E9%9B%86THUCNews) |
- Since the original release is scattered across many small fragment files, a script is provided here to merge them into several single files. |
usmiva/bg_ner_bsnlp | ---
license: apache-2.0
task_categories:
- token-classification
language:
- bg
---
# Dataset Card for Bulgarian Named Entity Recognition
The initial dataset is taken from the Balto-Slavic NLP shared task and further transformed into the format appropriate for token classification. The instances are randomized and split into train and test splits.
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset was initially created for the BSNLP Shared Task 2019 and reported in the conference paper "The Second Cross-Lingual Challenge on Recognition, Normalization, Classification, and Linking of Named Entities across Slavic Languages".
It was further improved in "Reconstructing NER Corpora: a Case Study on Bulgarian" and finally transformed into a CSV format appropriate for token classification on Hugging Face.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
train, test
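A minimal loading sketch, assuming the default configuration exposes these two splits:
```python
from datasets import load_dataset

bsnlp = load_dataset("usmiva/bg_ner_bsnlp")
train, test = bsnlp["train"], bsnlp["test"]
```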
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{piskorski-etal-2019-second,
title = "The Second Cross-Lingual Challenge on Recognition, Normalization, Classification, and Linking of Named Entities across {S}lavic Languages",
author = "Piskorski, Jakub and Laskova, Laska and Marci{\'n}czuk, Micha{\l} and Pivovarova, Lidia and P{\v{r}}ib{\'a}{\v{n}}, Pavel
and Steinberger, Josef and Yangarber, Roman",
booktitle = "Proceedings of the 7th Workshop on Balto-Slavic Natural Language Processing",
month = aug,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/W19-3709",
pages = "63--74"
}
@inproceedings{marinova-etal-2020-reconstructing,
title = "Reconstructing {NER} Corpora: a Case Study on {B}ulgarian",
author = "Marinova, Iva and
Laskova, Laska and
Osenova, Petya and
Simov, Kiril and
Popov, Alexander",
booktitle = "Proceedings of the Twelfth Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.571",
pages = "4647--4652",
abstract = "The paper reports on the usage of deep learning methods for improving a Named Entity Recognition (NER) training corpus and for predicting and annotating new types in a test corpus. We show how the annotations in a type-based corpus of named entities (NE) were populated as occurrences within it, thus ensuring density of the training information. A deep learning model was adopted for discovering inconsistencies in the initial annotation and for learning new NE types. The evaluation results get improved after data curation, randomization and deduplication.",
language = "English",
ISBN = "979-10-95546-34-4",
}
```
### Contributions
[More Information Needed] |
Zhuoran918/Donut_v1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 27724451.0
num_examples: 76
- name: test
num_bytes: 1745330.0
num_examples: 5
- name: validation
num_bytes: 3759644.0
num_examples: 9
download_size: 25491818
dataset_size: 33229425.0
---
# Dataset Card for "Donut_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FINNUMBER/FINCH_TRAIN_SA_ESG_100_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: 'null'
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 909868
num_examples: 100
download_size: 502551
dataset_size: 909868
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nexdata/Face_Recognition_Data_with_Gauze_Mask | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Face_Recognition_Data_with_Gauze_Mask
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1084?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
5,030 People - Face Recognition Data with Gauze Mask: for each subject, 7 images were collected. The dataset diversity includes multiple mask types, ages, light conditions and scenes. This data can be applied to computer vision tasks such as occluded face detection and recognition.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1084?source=Huggingface
### Supported Tasks and Leaderboards
face-detection, computer-vision: The dataset can be used to train a model for face detection.
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
Back-up/stock-f319 | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 538219641
num_examples: 113262
download_size: 190178900
dataset_size: 538219641
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AbderrahmanSkiredj1/quran_by_sura_by_aya_range | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 34991900
num_examples: 10842
download_size: 15025773
dataset_size: 34991900
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tr416/dataset_20231006_190920 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 74071
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_190920"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jinyan1/GossipCop | ---
configs:
- config_name: default
data_files:
- split: MF
path: data/MF-*
- split: HF
path: data/HF-*
- split: MR
path: data/MR-*
- split: HR
path: data/HR-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: description
dtype: string
splits:
- name: MF
num_bytes: 6445810
num_examples: 4084
- name: HF
num_bytes: 12350244
num_examples: 4084
- name: MR
num_bytes: 10848721
num_examples: 4169
- name: HR
num_bytes: 27606118
num_examples: 8168
download_size: 35223867
dataset_size: 57250893
---
# Dataset Card for "GossipCop"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Fredithefish__CrimsonPajama | ---
pretty_name: Evaluation run of Fredithefish/CrimsonPajama
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Fredithefish/CrimsonPajama](https://huggingface.co/Fredithefish/CrimsonPajama)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__CrimsonPajama\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T20:55:57.055960](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__CrimsonPajama/blob/main/results_2023-10-17T20-55-57.055960.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006396812080536913,\n\
\ \"em_stderr\": 0.0008164468837432291,\n \"f1\": 0.08161598154362382,\n\
\ \"f1_stderr\": 0.0017802453361789499,\n \"acc\": 0.3286203762267581,\n\
\ \"acc_stderr\": 0.007694655126017044\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.006396812080536913,\n \"em_stderr\": 0.0008164468837432291,\n\
\ \"f1\": 0.08161598154362382,\n \"f1_stderr\": 0.0017802453361789499\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.002001305720948034\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6519337016574586,\n \"acc_stderr\": 0.013388004531086054\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Fredithefish/CrimsonPajama
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T20_55_57.055960
path:
- '**/details_harness|drop|3_2023-10-17T20-55-57.055960.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T20-55-57.055960.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T20_55_57.055960
path:
- '**/details_harness|gsm8k|5_2023-10-17T20-55-57.055960.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T20-55-57.055960.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:19:26.317110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:19:26.317110.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:19:26.317110.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T20_55_57.055960
path:
- '**/details_harness|winogrande|5_2023-10-17T20-55-57.055960.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T20-55-57.055960.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_19_26.317110
path:
- results_2023-07-19T19:19:26.317110.parquet
- split: 2023_10_17T20_55_57.055960
path:
- results_2023-10-17T20-55-57.055960.parquet
- split: latest
path:
- results_2023-10-17T20-55-57.055960.parquet
---
# Dataset Card for Evaluation run of Fredithefish/CrimsonPajama
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/CrimsonPajama
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Fredithefish/CrimsonPajama](https://huggingface.co/Fredithefish/CrimsonPajama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__CrimsonPajama",
"harness_winogrande_5",
split="train")
```
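Each evaluated task is exposed as its own configuration, and each run as a timestamp-named split (plus the "latest" alias). A minimal sketch for enumerating both programmatically, using two helpers from the `datasets` library (the repository and config names below come from this card):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_Fredithefish__CrimsonPajama"

# Every evaluated task is a configuration, e.g. "harness_winogrande_5";
# the "results" configuration holds the aggregated metrics.
print(get_dataset_config_names(repo))

# Each run appears as a timestamp-named split, plus a "latest" alias.
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```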
## Latest results
These are the [latest results from run 2023-10-17T20:55:57.055960](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__CrimsonPajama/blob/main/results_2023-10-17T20-55-57.055960.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006396812080536913,
"em_stderr": 0.0008164468837432291,
"f1": 0.08161598154362382,
"f1_stderr": 0.0017802453361789499,
"acc": 0.3286203762267581,
"acc_stderr": 0.007694655126017044
},
"harness|drop|3": {
"em": 0.006396812080536913,
"em_stderr": 0.0008164468837432291,
"f1": 0.08161598154362382,
"f1_stderr": 0.0017802453361789499
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948034
},
"harness|winogrande|5": {
"acc": 0.6519337016574586,
"acc_stderr": 0.013388004531086054
}
}
```
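To retrieve these aggregated numbers programmatically rather than copying them from the JSON above, you can load the "results" configuration directly; a short sketch (the "results" configuration and its "latest" split are defined in this card's YAML header):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated run-level metrics;
# the "latest" split points at the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_Fredithefish__CrimsonPajama",
    "results",
    split="latest",
)
print(results[0])  # first row of the aggregated results table
```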
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sombuck/sample | ---
license: gpl-3.0
task_categories:
- image-classification
task_ids:
- multi-label-image-classification
language:
- en
pretty_name: sample dataset
---
# Dataset Card for Architectural Multi-Label Classification
This is a sample dataset of images randomly chosen from the internet, used to demonstrate how to use Hugging Face for AEC datasets.
unsloth/notebooks | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall | ---
pretty_name: Evaluation run of Lvxy1117/amber_fine_tune_sgall
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lvxy1117/amber_fine_tune_sgall](https://huggingface.co/Lvxy1117/amber_fine_tune_sgall)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T04:49:00.115070](https://huggingface.co/datasets/open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall/blob/main/results_2024-02-14T04-49-00.115070.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.32013909097742504,\n\
\ \"acc_stderr\": 0.032715298552530025,\n \"acc_norm\": 0.3224755631334577,\n\
\ \"acc_norm_stderr\": 0.03349829796565176,\n \"mc1\": 0.2558139534883721,\n\
\ \"mc1_stderr\": 0.01527417621928336,\n \"mc2\": 0.4047870475782831,\n\
\ \"mc2_stderr\": 0.014878403265738149\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.40955631399317405,\n \"acc_stderr\": 0.014370358632472446,\n\
\ \"acc_norm\": 0.44283276450511944,\n \"acc_norm_stderr\": 0.014515573873348902\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5653256323441546,\n\
\ \"acc_stderr\": 0.004947010937455345,\n \"acc_norm\": 0.7476598287193786,\n\
\ \"acc_norm_stderr\": 0.004334676952703862\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438648,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438648\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835362,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835362\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3310344827586207,\n \"acc_stderr\": 0.039215453124671215,\n\
\ \"acc_norm\": 0.3310344827586207,\n \"acc_norm_stderr\": 0.039215453124671215\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.02218203720294836,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.02218203720294836\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2967741935483871,\n \"acc_stderr\": 0.025988500792411894,\n \"\
acc_norm\": 0.2967741935483871,\n \"acc_norm_stderr\": 0.025988500792411894\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n \"\
acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3393939393939394,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.3393939393939394,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3181818181818182,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.03499807276193337,\n\
\ \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.03499807276193337\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204423,\n\
\ \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204423\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \
\ \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.03216298420593614,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.03216298420593614\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3174311926605505,\n \"acc_stderr\": 0.0199571521984605,\n \"acc_norm\"\
: 0.3174311926605505,\n \"acc_norm_stderr\": 0.0199571521984605\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.17592592592592593,\n\
\ \"acc_stderr\": 0.025967420958258533,\n \"acc_norm\": 0.17592592592592593,\n\
\ \"acc_norm_stderr\": 0.025967420958258533\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.3284313725490196,\n \"acc_stderr\": 0.03296245110172229,\n\
\ \"acc_norm\": 0.3284313725490196,\n \"acc_norm_stderr\": 0.03296245110172229\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.35443037974683544,\n \"acc_stderr\": 0.031137304297185805,\n \
\ \"acc_norm\": 0.35443037974683544,\n \"acc_norm_stderr\": 0.031137304297185805\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.43946188340807174,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.43946188340807174,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.33587786259541985,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.33587786259541985,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292535,\n \"\
acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292535\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.42735042735042733,\n\
\ \"acc_stderr\": 0.03240847393516326,\n \"acc_norm\": 0.42735042735042733,\n\
\ \"acc_norm_stderr\": 0.03240847393516326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.01757070523925654,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.01757070523925654\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.33236994219653176,\n \"acc_stderr\": 0.025361168749688228,\n\
\ \"acc_norm\": 0.33236994219653176,\n \"acc_norm_stderr\": 0.025361168749688228\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\
\ \"acc_stderr\": 0.01414957534897627,\n \"acc_norm\": 0.2335195530726257,\n\
\ \"acc_norm_stderr\": 0.01414957534897627\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3300653594771242,\n \"acc_stderr\": 0.026925654653615686,\n\
\ \"acc_norm\": 0.3300653594771242,\n \"acc_norm_stderr\": 0.026925654653615686\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.4180064308681672,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.38271604938271603,\n \"acc_stderr\": 0.027044538138402605,\n\
\ \"acc_norm\": 0.38271604938271603,\n \"acc_norm_stderr\": 0.027044538138402605\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2653194263363755,\n\
\ \"acc_stderr\": 0.011276198843958878,\n \"acc_norm\": 0.2653194263363755,\n\
\ \"acc_norm_stderr\": 0.011276198843958878\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.0276784686421447,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.0276784686421447\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3235294117647059,\n \"acc_stderr\": 0.018926082916083393,\n \
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.018926082916083393\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.026711430555538422,\n\
\ \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.026711430555538422\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.35323383084577115,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.35323383084577115,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4502923976608187,\n \"acc_stderr\": 0.03815827365913235,\n\
\ \"acc_norm\": 0.4502923976608187,\n \"acc_norm_stderr\": 0.03815827365913235\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n\
\ \"mc1_stderr\": 0.01527417621928336,\n \"mc2\": 0.4047870475782831,\n\
\ \"mc2_stderr\": 0.014878403265738149\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6748224151539068,\n \"acc_stderr\": 0.013165525471764361\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.043214556482183475,\n \
\ \"acc_stderr\": 0.005600987515237868\n }\n}\n```"
repo_url: https://huggingface.co/Lvxy1117/amber_fine_tune_sgall
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|arc:challenge|25_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|gsm8k|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hellaswag|10_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T04-49-00.115070.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T04-49-00.115070.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- '**/details_harness|winogrande|5_2024-02-14T04-49-00.115070.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T04-49-00.115070.parquet'
- config_name: results
data_files:
- split: 2024_02_14T04_49_00.115070
path:
- results_2024-02-14T04-49-00.115070.parquet
- split: latest
path:
- results_2024-02-14T04-49-00.115070.parquet
---
# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_sgall
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Lvxy1117/amber_fine_tune_sgall](https://huggingface.co/Lvxy1117/amber_fine_tune_sgall) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall",
"harness_winogrande_5",
split="train")
```
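The aggregated metrics live in the "results" configuration; here is a minimal sketch of loading them (the configuration and split names are taken from this card's config list above):
```python
from datasets import load_dataset

# The "latest" split of the "results" configuration always points to the most
# recent aggregated results for this model.
results = load_dataset("open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall",
	"results",
	split="latest")

# To pin a specific run instead, use its timestamped split, e.g.
# split="2024_02_14T04_49_00.115070".
print(results[0])
```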
## Latest results
These are the [latest results from run 2024-02-14T04:49:00.115070](https://huggingface.co/datasets/open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall/blob/main/results_2024-02-14T04-49-00.115070.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.32013909097742504,
"acc_stderr": 0.032715298552530025,
"acc_norm": 0.3224755631334577,
"acc_norm_stderr": 0.03349829796565176,
"mc1": 0.2558139534883721,
"mc1_stderr": 0.01527417621928336,
"mc2": 0.4047870475782831,
"mc2_stderr": 0.014878403265738149
},
"harness|arc:challenge|25": {
"acc": 0.40955631399317405,
"acc_stderr": 0.014370358632472446,
"acc_norm": 0.44283276450511944,
"acc_norm_stderr": 0.014515573873348902
},
"harness|hellaswag|10": {
"acc": 0.5653256323441546,
"acc_stderr": 0.004947010937455345,
"acc_norm": 0.7476598287193786,
"acc_norm_stderr": 0.004334676952703862
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438648,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438648
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617748,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617748
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3310344827586207,
"acc_stderr": 0.039215453124671215,
"acc_norm": 0.3310344827586207,
"acc_norm_stderr": 0.039215453124671215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.02218203720294836,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.02218203720294836
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.025988500792411894,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.025988500792411894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3393939393939394,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.3393939393939394,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37823834196891193,
"acc_stderr": 0.03499807276193337,
"acc_norm": 0.37823834196891193,
"acc_norm_stderr": 0.03499807276193337
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204423,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204423
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.02519575225182379,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.02519575225182379
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.03216298420593614,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.03216298420593614
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3174311926605505,
"acc_stderr": 0.0199571521984605,
"acc_norm": 0.3174311926605505,
"acc_norm_stderr": 0.0199571521984605
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.025967420958258533,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.025967420958258533
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3284313725490196,
"acc_stderr": 0.03296245110172229,
"acc_norm": 0.3284313725490196,
"acc_norm_stderr": 0.03296245110172229
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.35443037974683544,
"acc_stderr": 0.031137304297185805,
"acc_norm": 0.35443037974683544,
"acc_norm_stderr": 0.031137304297185805
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.43946188340807174,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.43946188340807174,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.33587786259541985,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.33587786259541985,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.04345724570292535,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.04345724570292535
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.42735042735042733,
"acc_stderr": 0.03240847393516326,
"acc_norm": 0.42735042735042733,
"acc_norm_stderr": 0.03240847393516326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.01757070523925654,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.01757070523925654
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.33236994219653176,
"acc_stderr": 0.025361168749688228,
"acc_norm": 0.33236994219653176,
"acc_norm_stderr": 0.025361168749688228
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.01414957534897627,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.01414957534897627
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3300653594771242,
"acc_stderr": 0.026925654653615686,
"acc_norm": 0.3300653594771242,
"acc_norm_stderr": 0.026925654653615686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4180064308681672,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.4180064308681672,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.38271604938271603,
"acc_stderr": 0.027044538138402605,
"acc_norm": 0.38271604938271603,
"acc_norm_stderr": 0.027044538138402605
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2653194263363755,
"acc_stderr": 0.011276198843958878,
"acc_norm": 0.2653194263363755,
"acc_norm_stderr": 0.011276198843958878
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.0276784686421447,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.0276784686421447
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.018926082916083393,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.018926082916083393
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.026711430555538422,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.026711430555538422
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.35323383084577115,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.35323383084577115,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4502923976608187,
"acc_stderr": 0.03815827365913235,
"acc_norm": 0.4502923976608187,
"acc_norm_stderr": 0.03815827365913235
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2558139534883721,
"mc1_stderr": 0.01527417621928336,
"mc2": 0.4047870475782831,
"mc2_stderr": 0.014878403265738149
},
"harness|winogrande|5": {
"acc": 0.6748224151539068,
"acc_stderr": 0.013165525471764361
},
"harness|gsm8k|5": {
"acc": 0.043214556482183475,
"acc_stderr": 0.005600987515237868
}
}
```
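As a rough illustration of how these per-task entries can be consumed, here is a short sketch (this is not the leaderboard's own aggregation code; the file name is the results file linked above, and the top-level "results" key is an assumption about its layout):
```python
import json

# Parse the downloaded results file; the dictionary shown in this card is
# assumed to sit under its top-level "results" key.
with open("results_2024-02-14T04-49-00.115070.json") as f:
    results = json.load(f)["results"]

# Average the accuracies of the MMLU (hendrycksTest) subtasks.
mmlu = {name: scores for name, scores in results.items()
        if name.startswith("harness|hendrycksTest-")}
mean_acc = sum(scores["acc"] for scores in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```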
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
seungheondoh/multimodal_msd | ---
dataset_info:
features:
- name: msd_track_id
dtype: string
- name: shazam_id
dtype: string
- name: youtube_url
dtype: string
- name: score
dtype: float64
- name: msd_title
dtype: string
- name: shazam_title
dtype: string
- name: msd_artist
dtype: string
- name: shazam_artist
dtype: string
- name: msd_album
dtype: string
- name: shazam_album
dtype: string
- name: msd_year
dtype: int64
- name: shazam_year
dtype: string
- name: msd_song_id
dtype: string
- name: msd_artist_id
dtype: string
- name: msd_artist_mbid
dtype: string
- name: shazam_label
dtype: string
- name: shazam_album_art_img_url
dtype: string
- name: shazam_artist_img_url
dtype: string
- name: path
dtype: string
splits:
- name: train
num_bytes: 521905013
num_examples: 899118
download_size: 317981770
dataset_size: 521905013
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RamAnanth1/lex-fridman-podcasts | ---
annotations_creators:
- found
language:
- en
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: 'Lex Fridman Podcasts'
size_categories:
- n<1K
task_categories:
- text-classification
- text-generation
- summarization
task_ids:
- sentiment-analysis
- dialogue-modeling
- language-modeling
---
# Dataset Card for Lex Fridman Podcasts Dataset
This dataset is sourced from Andrej Karpathy's [Lexicap website](https://karpathy.ai/lexicap/), which contains English transcripts of Lex Fridman's wonderful podcast episodes. The transcripts were generated using OpenAI's large-sized [Whisper model](https://github.com/openai/whisper).
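As a minimal sketch of how such transcripts can be produced with the openai-whisper package (the exact Lexicap pipeline is not documented here, and `episode.mp3` is a hypothetical local audio file):
```python
import whisper  # pip install openai-whisper

# Load the large model, matching the model size mentioned above.
model = whisper.load_model("large")

# "episode.mp3" is a hypothetical local recording of one podcast episode.
result = model.transcribe("episode.mp3")
print(result["text"])
```
|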
harshnarayan12/flat | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Kiddyz__testlm-1 | ---
pretty_name: Evaluation run of Kiddyz/testlm-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kiddyz/testlm-1](https://huggingface.co/Kiddyz/testlm-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kiddyz__testlm-1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-16T12:53:22.897812](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm-1/blob/main/results_2023-08-16T12%3A53%3A22.897812.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5128834307003443,\n\
\ \"acc_stderr\": 0.03501260490290392,\n \"acc_norm\": 0.5166256154161327,\n\
\ \"acc_norm_stderr\": 0.03500071412093006,\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.48413168566081527,\n\
\ \"mc2_stderr\": 0.015167638286466481\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.014611305705056992,\n\
\ \"acc_norm\": 0.5349829351535836,\n \"acc_norm_stderr\": 0.014575583922019669\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5705038836885082,\n\
\ \"acc_stderr\": 0.004939925958728884,\n \"acc_norm\": 0.758016331408086,\n\
\ \"acc_norm_stderr\": 0.004274091605308121\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750573,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750573\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n\
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5903225806451613,\n \"acc_stderr\": 0.027976054915347368,\n \"\
acc_norm\": 0.5903225806451613,\n \"acc_norm_stderr\": 0.027976054915347368\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n \"\
acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412202,\n\
\ \"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412202\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7119266055045872,\n \"acc_stderr\": 0.01941644589263603,\n \"\
acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.01941644589263603\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399813,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399813\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.0162460870697014,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.0162460870697014\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29720670391061454,\n\
\ \"acc_stderr\": 0.015285313353641602,\n \"acc_norm\": 0.29720670391061454,\n\
\ \"acc_norm_stderr\": 0.015285313353641602\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.028452639985088006,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.028452639985088006\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n\
\ \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3754889178617992,\n\
\ \"acc_stderr\": 0.012367945396728208,\n \"acc_norm\": 0.3754889178617992,\n\
\ \"acc_norm_stderr\": 0.012367945396728208\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150124,\n \
\ \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150124\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.48413168566081527,\n\
\ \"mc2_stderr\": 0.015167638286466481\n }\n}\n```"
repo_url: https://huggingface.co/Kiddyz/testlm-1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|arc:challenge|25_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hellaswag|10_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:53:22.897812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:53:22.897812.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T12:53:22.897812.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T12:53:22.897812.parquet'
- config_name: results
data_files:
- split: 2023_08_16T12_53_22.897812
path:
- results_2023-08-16T12:53:22.897812.parquet
- split: latest
path:
- results_2023-08-16T12:53:22.897812.parquet
---
# Dataset Card for Evaluation run of Kiddyz/testlm-1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Kiddyz/testlm-1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Kiddyz/testlm-1](https://huggingface.co/Kiddyz/testlm-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kiddyz__testlm-1",
"harness_truthfulqa_mc_0",
split="train")
```
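If you only need the aggregated metrics, the same call works with the "results" configuration and the "latest" split (both defined in this card's configs):
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_Kiddyz__testlm-1",
	"results",
	split="latest")
```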
## Latest results
These are the [latest results from run 2023-08-16T12:53:22.897812](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm-1/blob/main/results_2023-08-16T12%3A53%3A22.897812.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.5128834307003443,
"acc_stderr": 0.03501260490290392,
"acc_norm": 0.5166256154161327,
"acc_norm_stderr": 0.03500071412093006,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.48413168566081527,
"mc2_stderr": 0.015167638286466481
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.014611305705056992,
"acc_norm": 0.5349829351535836,
"acc_norm_stderr": 0.014575583922019669
},
"harness|hellaswag|10": {
"acc": 0.5705038836885082,
"acc_stderr": 0.004939925958728884,
"acc_norm": 0.758016331408086,
"acc_norm_stderr": 0.004274091605308121
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750573,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750573
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.03065674869673943,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.03065674869673943
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.027976054915347368,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.027976054915347368
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49743589743589745,
"acc_stderr": 0.025350672979412202,
"acc_norm": 0.49743589743589745,
"acc_norm_stderr": 0.025350672979412202
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.01941644589263603,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.01941644589263603
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.03166009679399813,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.03166009679399813
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.0162460870697014,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.0162460870697014
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29720670391061454,
"acc_stderr": 0.015285313353641602,
"acc_norm": 0.29720670391061454,
"acc_norm_stderr": 0.015285313353641602
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3754889178617992,
"acc_stderr": 0.012367945396728208,
"acc_norm": 0.3754889178617992,
"acc_norm_stderr": 0.012367945396728208
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49836601307189543,
"acc_stderr": 0.020227726838150124,
"acc_norm": 0.49836601307189543,
"acc_norm_stderr": 0.020227726838150124
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.48413168566081527,
"mc2_stderr": 0.015167638286466481
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
positivethoughts/rewrite_2.1k | ---
pretty_name: c
---
rewrite 2.1k
Essays rewritten by gemma-7b-it on an A100 in bfloat16 using TGI (Text Generation Inference).
The original essays were taken from https://huggingface.co/datasets/euclaise/writingprompts, which is from Reddit.
V1
2.1k essays. The prompts were created using ChatGPT; there are about 100 different prompts, so each prompt was used multiple times.
https://www.kaggle.com/datasets/nbroad/gemma-rewrite-nbroad |
jeapaul/europarl_bilingual_processed | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 281100121
num_examples: 1892723
download_size: 155904108
dataset_size: 281100121
---
# Dataset Card for "europarl_bilingual_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_find_passage_train50_eval40_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 12528
num_examples: 140
- name: validation
num_bytes: 4600
num_examples: 40
download_size: 11793
dataset_size: 17128
---
# Dataset Card for "random_letter_find_passage_train50_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
louisbrulenaudet/code-rural-ancien | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code rural (ancien)
source_datasets:
- original
pretty_name: Code rural (ancien)
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code rural (ancien), non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
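Note that with `streaming=True` each loaded dataset is iterable: the thread pool parallelizes only the initial resolution of each repository, and the rows themselves are fetched lazily as you iterate over the concatenated dataset. Pass `streaming=False` if you would rather download and cache everything up front.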
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
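As a rough illustration of how the fields above fit together, the sketch below pairs a random instruction from this list with a hypothetical parsed article; the exact mapping (in particular what goes into `input`) is an assumption for illustration, not the authors' published pipeline:
```python
import random

def build_record(article: dict) -> dict:
    # `instructions` is the list defined above; `article` is a hypothetical
    # dict with "text", "start", "expiration" and "num" keys.
    return {
        "instruction": random.choice(instructions),
        "input": "",  # assumption: left empty, the instruction carries the task
        "output": article["text"],
        "start": article["start"],
        "expiration": article["expiration"],
        "num": article["num"],
    }
```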
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
AwesomeEmerald/OpenNaturalConvo | ---
license: mit
---
|
jamescalam/youtube-transcriptions | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license:
- afl-3.0
multilinguality:
- monolingual
pretty_name: Youtube Transcriptions
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- youtube
- technical
- speech to text
- speech
- video
- video search
- audio
- audio search
task_categories:
- conversational
- question-answering
- text-retrieval
- visual-question-answering
task_ids:
- open-domain-qa
- extractive-qa
- document-retrieval
- visual-question-answering
---
The YouTube transcriptions dataset contains technical tutorials (currently from [James Briggs](https://www.youtube.com/c/jamesbriggs), [Daniel Bourke](https://www.youtube.com/channel/UCr8O8l5cCX85Oem1d18EezQ), and [AI Coffee Break](https://www.youtube.com/c/aicoffeebreak)) transcribed using [OpenAI's Whisper](https://huggingface.co/openai/whisper-large) (large). Each row represents roughly a sentence-length chunk of text alongside the video URL and timestamp.
Note that each item in the dataset contains just a short chunk of text. For most use cases you will likely need to merge multiple rows to create more substantial chunks of text. If you need to do that, this code snippet will help:
```python
from datasets import load_dataset
# first download the dataset
data = load_dataset(
'jamescalam/youtube-transcriptions',
split='train'
)
new_data = [] # this will store adjusted data
window = 6 # number of sentences to combine
stride = 3 # number of sentences to 'stride' over, used to create overlap
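# with window=6 and stride=3, consecutive chunks overlap by 3 sentences,
# so text near a chunk boundary still appears with context in a neighbour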
for i in range(0, len(data), stride):
i_end = min(len(data)-1, i+window)
if data[i]['title'] != data[i_end]['title']:
# in this case we skip this entry as we have start/end of two videos
continue
# create larger text chunk
text = ' '.join(data[i:i_end]['text'])
# add to adjusted data list
new_data.append({
'start': data[i]['start'],
'end': data[i_end]['end'],
'title': data[i]['title'],
'text': text,
'id': data[i]['id'],
'url': data[i]['url'],
'published': data[i]['published']
})
``` |
dmayhem93/self-critiquing-critique | ---
dataset_info:
features:
- name: id
dtype: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: time
dtype: float64
- name: labeler
dtype: string
- name: is_topic_based_summarization
dtype: bool
- name: category
dtype: string
- name: severity
dtype: int64
- name: text_quotes
list:
- name: begin
dtype: int64
- name: end
dtype: int64
- name: response_quotes
list:
- name: begin
dtype: int64
- name: end
dtype: int64
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 262218653
num_examples: 61503
- name: test
num_bytes: 43153769
num_examples: 9437
download_size: 36446351
dataset_size: 305372422
---
# Dataset Card for "self-critiquing-critique"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
0x70DA/ms_marco_clean | ---
dataset_info:
features:
- name: query
dtype: string
- name: context
dtype: string
- name: answer
dtype: string
splits:
- name: validation
num_bytes: 25195165.999419145
num_examples: 7059
- name: train
num_bytes: 208554643.03976434
num_examples: 58192
- name: test
num_bytes: 24321156.439980637
num_examples: 6814
download_size: 135125755
dataset_size: 258070965.47916412
---
# Dataset Card for "ms_marco_clean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/rurounikenshin2023 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Rurouni Kenshin (2023)
This is the image base of bangumi Rurouni Kenshin (2023), we detected 71 characters, 9015 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
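Each character's images ship as a separate zip inside the repository, so you can fetch a single character with `huggingface_hub` — a minimal sketch, assuming the per-character paths match the Download links in the table below:
```python
from zipfile import ZipFile
from huggingface_hub import hf_hub_download

# Download the zip for character 18 (paths follow the table's Download links)
zip_path = hf_hub_download(
    repo_id="BangumiBase/rurounikenshin2023",
    filename="18/dataset.zip",
    repo_type="dataset",
)
ZipFile(zip_path).extractall("character_18")
```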
Here is the characters' preview:
| # | Images | Download |
|:------|---------:|:---------------------------|
| 0 | 619 | [Download](0/dataset.zip) |
| 1 | 14 | [Download](1/dataset.zip) |
| 2 | 64 | [Download](2/dataset.zip) |
| 3 | 164 | [Download](3/dataset.zip) |
| 4 | 63 | [Download](4/dataset.zip) |
| 5 | 109 | [Download](5/dataset.zip) |
| 6 | 30 | [Download](6/dataset.zip) |
| 7 | 293 | [Download](7/dataset.zip) |
| 8 | 54 | [Download](8/dataset.zip) |
| 9 | 91 | [Download](9/dataset.zip) |
| 10 | 174 | [Download](10/dataset.zip) |
| 11 | 19 | [Download](11/dataset.zip) |
| 12 | 36 | [Download](12/dataset.zip) |
| 13 | 43 | [Download](13/dataset.zip) |
| 14 | 89 | [Download](14/dataset.zip) |
| 15 | 22 | [Download](15/dataset.zip) |
| 16 | 215 | [Download](16/dataset.zip) |
| 17 | 16 | [Download](17/dataset.zip) |
| 18 | 919 | [Download](18/dataset.zip) |
| 19 | 61 | [Download](19/dataset.zip) |
| 20 | 1104 | [Download](20/dataset.zip) |
| 21 | 21 | [Download](21/dataset.zip) |
| 22 | 888 | [Download](22/dataset.zip) |
| 23 | 106 | [Download](23/dataset.zip) |
| 24 | 38 | [Download](24/dataset.zip) |
| 25 | 368 | [Download](25/dataset.zip) |
| 26 | 22 | [Download](26/dataset.zip) |
| 27 | 29 | [Download](27/dataset.zip) |
| 28 | 28 | [Download](28/dataset.zip) |
| 29 | 27 | [Download](29/dataset.zip) |
| 30 | 70 | [Download](30/dataset.zip) |
| 31 | 12 | [Download](31/dataset.zip) |
| 32 | 46 | [Download](32/dataset.zip) |
| 33 | 68 | [Download](33/dataset.zip) |
| 34 | 19 | [Download](34/dataset.zip) |
| 35 | 15 | [Download](35/dataset.zip) |
| 36 | 1733 | [Download](36/dataset.zip) |
| 37 | 144 | [Download](37/dataset.zip) |
| 38 | 26 | [Download](38/dataset.zip) |
| 39 | 59 | [Download](39/dataset.zip) |
| 40 | 49 | [Download](40/dataset.zip) |
| 41 | 13 | [Download](41/dataset.zip) |
| 42 | 27 | [Download](42/dataset.zip) |
| 43 | 137 | [Download](43/dataset.zip) |
| 44 | 33 | [Download](44/dataset.zip) |
| 45 | 61 | [Download](45/dataset.zip) |
| 46 | 60 | [Download](46/dataset.zip) |
| 47 | 26 | [Download](47/dataset.zip) |
| 48 | 49 | [Download](48/dataset.zip) |
| 49 | 13 | [Download](49/dataset.zip) |
| 50 | 30 | [Download](50/dataset.zip) |
| 51 | 19 | [Download](51/dataset.zip) |
| 52 | 44 | [Download](52/dataset.zip) |
| 53 | 16 | [Download](53/dataset.zip) |
| 54 | 20 | [Download](54/dataset.zip) |
| 55 | 22 | [Download](55/dataset.zip) |
| 56 | 15 | [Download](56/dataset.zip) |
| 57 | 139 | [Download](57/dataset.zip) |
| 58 | 14 | [Download](58/dataset.zip) |
| 59 | 64 | [Download](59/dataset.zip) |
| 60 | 24 | [Download](60/dataset.zip) |
| 61 | 23 | [Download](61/dataset.zip) |
| 62 | 10 | [Download](62/dataset.zip) |
| 63 | 10 | [Download](63/dataset.zip) |
| 64 | 8 | [Download](64/dataset.zip) |
| 65 | 9 | [Download](65/dataset.zip) |
| 66 | 6 | [Download](66/dataset.zip) |
| 67 | 51 | [Download](67/dataset.zip) |
| 68 | 21 | [Download](68/dataset.zip) |
| 69 | 7 | [Download](69/dataset.zip) |
| noise | 77 | [Download](-1/dataset.zip) |
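If you plan to clean the archives before training, a minimal download-and-filter sketch follows. The repository ID below is a placeholder (substitute this dataset's actual ID), and `Image.verify()` only catches unreadable files, so some semantic noise may remain:

```python
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download
from PIL import Image

REPO_ID = "some-user/some-character-dataset"  # placeholder; use the real repo ID
CHARACTER = "0"

# Fetch and extract one character's bundle.
archive = hf_hub_download(
    repo_id=REPO_ID, filename=f"{CHARACTER}/dataset.zip", repo_type="dataset"
)
out_dir = Path(f"character_{CHARACTER}")
with zipfile.ZipFile(archive) as zf:
    zf.extractall(out_dir)

# Drop files that fail to decode; roughly 1% of samples may be noisy,
# so stricter filtering (or manual review) may still be warranted.
kept = 0
for path in sorted(p for p in out_dir.rglob("*") if p.is_file()):
    try:
        with Image.open(path) as img:
            img.verify()  # raises on truncated or corrupt images
        kept += 1
    except Exception:
        path.unlink()
print(f"kept {kept} images for character {CHARACTER}")
```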
|
one-sec-cv12/chunk_144 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20525841792.0
num_examples: 213704
download_size: 18760055690
dataset_size: 20525841792.0
---
# Dataset Card for "chunk_144"
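A minimal loading sketch, assuming the standard `datasets` audio workflow; streaming avoids the ~19 GB download:

```python
from datasets import load_dataset

# Stream rather than downloading the full split up front.
ds = load_dataset("one-sec-cv12/chunk_144", split="train", streaming=True)

example = next(iter(ds))
audio = example["audio"]
print(audio["sampling_rate"])  # 16000, per the dataset features
print(len(audio["array"]))    # decoded waveform samples
```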
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wid4soe/182_simpsons_train | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: new_image
dtype: image
splits:
- name: train
num_bytes: 15305372.0
num_examples: 550
- name: valid
num_bytes: 1663647.0
num_examples: 54
- name: test
num_bytes: 4341184.0
num_examples: 151
download_size: 21260107
dataset_size: 21310203.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
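A minimal loading sketch for the (original image, edit prompt, new image) triplets, assuming the standard `datasets` image workflow:

```python
from datasets import load_dataset

ds = load_dataset("wid4soe/182_simpsons_train", split="train")

ex = ds[0]
print(ex["edit_prompt"])                                # the editing instruction
print(ex["original_image"].size, ex["new_image"].size)  # decoded PIL images
```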
|
Abdou/dz-sentiment-yt-comments | ---
license: mit
task_categories:
- text-classification
language:
- ar
size_categories:
- 10K<n<100K
---
# A Sentiment Analysis Dataset for the Algerian Dialect of Arabic
This dataset consists of 50,016 samples of comments extracted from Algerian YouTube channels. It is manually annotated with 3 classes (the `label` column) and is not balanced. Here is the number of rows in each class:
- 0 (Negative): **17,033 (34.06%)**
- 1 (Neutral): **11,136 (22.26%)**
- 2 (Positive): **21,847 (43.68%)**
Please note that the dataset contains some swear words, so use it with caution.
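A minimal sketch for inspecting the class balance, assuming a single `train` split (the split name is an assumption) and the 0/1/2 `label` mapping above:

```python
from collections import Counter

from datasets import load_dataset

# "train" is an assumed split name; adjust if the dataset exposes others.
ds = load_dataset("Abdou/dz-sentiment-yt-comments", split="train")

label_names = {0: "negative", 1: "neutral", 2: "positive"}
counts = Counter(ds["label"])
for label, n in sorted(counts.items()):
    print(f"{label_names[label]}: {n} ({n / len(ds):.2%})")
```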
# Citation
If you find our work useful, please cite it as follows:
```bibtex
@article{2023,
title={Sentiment Analysis on Algerian Dialect with Transformers},
author={Zakaria Benmounah and Abdennour Boulesnane and Abdeladim Fadheli and Mustapha Khial},
journal={Applied Sciences},
volume={13},
number={20},
pages={11157},
year={2023},
month={Oct},
publisher={MDPI AG},
DOI={10.3390/app132011157},
ISSN={2076-3417},
url={http://dx.doi.org/10.3390/app132011157}
}
```
|
open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b-chat-pyg | ---
pretty_name: Evaluation run of openaccess-ai-collective/manticore-13b-chat-pyg
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openaccess-ai-collective/manticore-13b-chat-pyg](https://huggingface.co/openaccess-ai-collective/manticore-13b-chat-pyg)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b-chat-pyg\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T08:58:22.598379](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b-chat-pyg/blob/main/results_2023-09-23T08-58-22.598379.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02925755033557047,\n\
\ \"em_stderr\": 0.0017258801842771152,\n \"f1\": 0.09186136744966467,\n\
\ \"f1_stderr\": 0.0021533865918944134,\n \"acc\": 0.4337145226735951,\n\
\ \"acc_stderr\": 0.009944810794409672\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.02925755033557047,\n \"em_stderr\": 0.0017258801842771152,\n\
\ \"f1\": 0.09186136744966467,\n \"f1_stderr\": 0.0021533865918944134\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09552691432903715,\n \
\ \"acc_stderr\": 0.008096605771155745\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.0117930158176636\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openaccess-ai-collective/manticore-13b-chat-pyg
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T08_58_22.598379
path:
- '**/details_harness|drop|3_2023-09-23T08-58-22.598379.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T08-58-22.598379.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T08_58_22.598379
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-58-22.598379.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-58-22.598379.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T08_58_22.598379
path:
- '**/details_harness|winogrande|5_2023-09-23T08-58-22.598379.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T08-58-22.598379.parquet'
- config_name: results
data_files:
- split: 2023_09_23T08_58_22.598379
path:
- results_2023-09-23T08-58-22.598379.parquet
- split: latest
path:
- results_2023-09-23T08-58-22.598379.parquet
---
# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-13b-chat-pyg
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openaccess-ai-collective/manticore-13b-chat-pyg
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openaccess-ai-collective/manticore-13b-chat-pyg](https://huggingface.co/openaccess-ai-collective/manticore-13b-chat-pyg) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b-chat-pyg",
"harness_winogrande_5",
split="train")
```
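Per the configurations listed in the metadata above, the aggregated metrics live in a separate `results` configuration whose `latest` split points at the most recent run; a minimal sketch:

```python
from datasets import load_dataset

# Load only the aggregated metrics (the "results" config, most recent run).
results = load_dataset(
    "open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b-chat-pyg",
    "results",
    split="latest",
)
```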
## Latest results
These are the [latest results from run 2023-09-23T08:58:22.598379](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b-chat-pyg/blob/main/results_2023-09-23T08-58-22.598379.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.02925755033557047,
"em_stderr": 0.0017258801842771152,
"f1": 0.09186136744966467,
"f1_stderr": 0.0021533865918944134,
"acc": 0.4337145226735951,
"acc_stderr": 0.009944810794409672
},
"harness|drop|3": {
"em": 0.02925755033557047,
"em_stderr": 0.0017258801842771152,
"f1": 0.09186136744966467,
"f1_stderr": 0.0021533865918944134
},
"harness|gsm8k|5": {
"acc": 0.09552691432903715,
"acc_stderr": 0.008096605771155745
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.0117930158176636
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
enoahjr/twitter_dataset_1713169385 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 139995
num_examples: 398
download_size: 79551
dataset_size: 139995
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
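A minimal loading sketch, assuming the standard `datasets` workflow for the fields listed above:

```python
from datasets import load_dataset

ds = load_dataset("enoahjr/twitter_dataset_1713169385", split="train")

row = ds[0]
print(row["user_name"], row["created_at"])
print(row["tweet_content"])
```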
|
adabingw/lyrr-taylorswift | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1899803
num_examples: 991
download_size: 860281
dataset_size: 1899803
---
# Dataset Card for "lyrr-taylorswift"
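A minimal loading sketch, assuming the single `train` split of lyric text described in the metadata:

```python
from datasets import load_dataset

ds = load_dataset("adabingw/lyrr-taylorswift", split="train")
print(ds[0]["text"][:200])  # first 200 characters of the first entry
```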
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |