# Dataset Card for Evaluation run of Delcos/Starling-LM-11B-alpha
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Delcos/Starling-LM-11B-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Delcos/Starling-LM-11B-alpha](https://huggingface.co/Delcos/Starling-LM-11B-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Delcos__Starling-LM-11B-alpha",
"harness_winogrande_5",
	split="latest")
```
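The repository exposes one configuration per evaluated task plus the aggregated "results" configuration described above. A minimal sketch (our addition, not part of the card's tooling, assuming the standard `datasets` helper functions) to enumerate them and load the aggregated results:

```python
from datasets import get_dataset_config_names, get_dataset_split_names, load_dataset

repo = "open-llm-leaderboard/details_Delcos__Starling-LM-11B-alpha"

# One config per evaluated task, plus the aggregated "results" config.
print(get_dataset_config_names(repo))

# Splits are the run timestamp plus "latest".
print(get_dataset_split_names(repo, "results"))

# Load the aggregated results of the most recent run.
results = load_dataset(repo, "results", split="latest")
```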
## Latest results
These are the [latest results from run 2023-12-09T16:46:23.982029](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Starling-LM-11B-alpha/blob/main/results_2023-12-09T16-46-23.982029.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6362170386987497,
"acc_stderr": 0.03232328033089801,
"acc_norm": 0.6416906108621553,
"acc_norm_stderr": 0.03297213400326341,
"mc1": 0.38922888616891066,
"mc1_stderr": 0.01706855268069033,
"mc2": 0.5452023492477854,
"mc2_stderr": 0.016056772234309992
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946705,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.668990240987851,
"acc_stderr": 0.004696148339570979,
"acc_norm": 0.8485361481776539,
"acc_norm_stderr": 0.0035776774950640783
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.03496101481191179,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.03496101481191179
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895535,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895535
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276878,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276878
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233504,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7309417040358744,
"acc_stderr": 0.029763779406874972,
"acc_norm": 0.7309417040358744,
"acc_norm_stderr": 0.029763779406874972
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407004,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407004
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.025190181327608422,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.025190181327608422
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.01651959427529712,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.01651959427529712
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.02483605786829468,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.02483605786829468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532063,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532063
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360375,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38922888616891066,
"mc1_stderr": 0.01706855268069033,
"mc2": 0.5452023492477854,
"mc2_stderr": 0.016056772234309992
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
},
"harness|gsm8k|5": {
"acc": 0.379833206974981,
"acc_stderr": 0.013368818096960498
}
}
```
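As a worked example (our own sketch, not official leaderboard tooling), the six-benchmark leaderboard-style average can be recomputed from this file: ARC and HellaSwag contribute `acc_norm`, the 57 "hendrycksTest" (MMLU) subjects are averaged on `acc`, TruthfulQA contributes `mc2`, and Winogrande and GSM8K contribute `acc`. The raw-file URL (swapping `blob/` for `resolve/`) and the assumption that the metrics sit under a top-level `"results"` key shaped like the dict above are ours.

```python
import json
import urllib.request

# Sketch only: fetch the raw results file and recompute the
# leaderboard-style average. We assume the metrics live under a
# top-level "results" key matching the dict printed above.
URL = ("https://huggingface.co/datasets/open-llm-leaderboard/"
       "details_Delcos__Starling-LM-11B-alpha/resolve/main/"
       "results_2023-12-09T16-46-23.982029.json")
with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)
results = data.get("results", data)

# MMLU is reported per subject; average the "hendrycksTest" tasks first.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest")]

average = (
    results["harness|arc:challenge|25"]["acc_norm"]
    + results["harness|hellaswag|10"]["acc_norm"]
    + sum(mmlu) / len(mmlu)
    + results["harness|truthfulqa:mc|0"]["mc2"]
    + results["harness|winogrande|5"]["acc"]
    + results["harness|gsm8k|5"]["acc"]
) / 6
print(f"Leaderboard-style average: {average:.4f}")
```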
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
#region-us
|
# Dataset Card for Evaluation run of Delcos/Starling-LM-11B-alpha
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Delcos/Starling-LM-11B-alpha on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
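The snippet the sentence above refers to was stripped from this flattened copy of the card; a minimal reconstruction is given below, assuming the repository follows the leaderboard's `details_<org>__<model>` naming convention used elsewhere in this document:
```python
from datasets import load_dataset

# Hypothetical reconstruction of the elided snippet; the repo id is inferred
# from the leaderboard's "details_<org>__<model>" convention for
# Delcos/Starling-LM-11B-alpha.
data = load_dataset("open-llm-leaderboard/details_Delcos__Starling-LM-11B-alpha",
                    "harness_winogrande_5",
                    split="train")
```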
## Latest results
These are the latest results from run 2023-12-09T16:46:23.982029 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Delcos/Starling-LM-11B-alpha",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Delcos/Starling-LM-11B-alp... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Delcos/Starling-LM-11B-alpha",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model D... | [
6,
21,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Delcos/Starling-LM-11B-alpha## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Delcos/Sta... |
4256fb07972e90f6c8b3ba1c9e23a424d26ebe03 |
# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties](https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Ties",
"harness_winogrande_5",
split="train")
```
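Along the same lines, the aggregated metrics mentioned above live in the "results" configuration; a minimal sketch of pulling its most recent snapshot, with the config and split names taken from this card's own metadata:
```python
from datasets import load_dataset

# "results" holds the aggregated run metrics; the "latest" split always points
# to the most recent evaluation (here 2023-12-09T16:52:16.188783).
results = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Ties",
                       "results",
                       split="latest")
```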
## Latest results
These are the [latest results from run 2023-12-09T16:52:16.188783](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Ties/blob/main/results_2023-12-09T16-52-16.188783.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6262329269588028,
"acc_stderr": 0.03265531717656403,
"acc_norm": 0.6261458795179596,
"acc_norm_stderr": 0.033325096066245945,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5206285653012832,
"mc2_stderr": 0.015833320867777365
},
"harness|arc:challenge|25": {
"acc": 0.6109215017064846,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.6538538139812786,
"acc_stderr": 0.004747682003491466,
"acc_norm": 0.8234415455088627,
"acc_norm_stderr": 0.00380515334471309
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368881,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368881
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462836,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976054,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976054
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073336,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073336
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.025190181327608408,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.025190181327608408
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.02641560191438899,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.02641560191438899
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621358,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621358
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.01269157579265712,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.01269157579265712
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578655,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578655
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5206285653012832,
"mc2_stderr": 0.015833320867777365
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.01185004012485051
},
"harness|gsm8k|5": {
"acc": 0.6823351023502654,
"acc_stderr": 0.012824066621488836
}
}
```
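For programmatic use, the dict printed above can be queried directly; the sketch below inlines only two of its entries for brevity, so treat the literal values as an excerpt rather than the full results:
```python
# Minimal sketch: the full dict is the one printed above; only two entries
# are inlined here for brevity.
results = {
    "all": {"acc": 0.6262329269588028, "acc_norm": 0.6261458795179596},
    "harness|gsm8k|5": {"acc": 0.6823351023502654},
}

# Print each task's metrics, e.g. "harness|gsm8k|5 acc: 0.6823".
for task, metrics in results.items():
    for name, value in metrics.items():
        print(f"{task:>20} {name}: {value:.4f}")
```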
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Ties | [
"region:us"
] | 2023-12-09T16:55:10+00:00 | {"pretty_name": "Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties](https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T16:52:16.188783](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Ties/blob/main/results_2023-12-09T16-52-16.188783.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6262329269588028,\n \"acc_stderr\": 0.03265531717656403,\n \"acc_norm\": 0.6261458795179596,\n \"acc_norm_stderr\": 0.033325096066245945,\n \"mc1\": 0.3623011015911873,\n \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5206285653012832,\n \"mc2_stderr\": 0.015833320867777365\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6538538139812786,\n \"acc_stderr\": 0.004747682003491466,\n \"acc_norm\": 0.8234415455088627,\n \"acc_norm_stderr\": 0.00380515334471309\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368881,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368881\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976054,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976054\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073336,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073336\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8071519795657727,\n \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608408,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608408\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.02641560191438899,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.02641560191438899\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621358,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621358\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n \"acc_stderr\": 0.01269157579265712,\n \"acc_norm\": 0.4445893089960887,\n \"acc_norm_stderr\": 0.01269157579265712\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578655,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578655\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5206285653012832,\n \"mc2_stderr\": 0.015833320867777365\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.01185004012485051\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6823351023502654,\n \"acc_stderr\": 0.012824066621488836\n 
}\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-52-16.188783.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-52-16.188783.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-52-16.188783.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-52-16.188783.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-52-16.188783.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T16_52_16.188783", "path": ["**/details_harness|winogrande|5_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T16-52-16.188783.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T16_52_16.188783", "path": ["results_2023-12-09T16-52-16.188783.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T16-52-16.188783.parquet"]}]}]} | 2023-12-09T16:55:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
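(A sketch of the snippet that belongs here; the repository id is inferred from the model name, following the details_<org>__<model> convention used by these evaluation datasets.)
```python
from datasets import load_dataset

# Sketch: the repository id below is inferred from the model name
# ("open-llm-leaderboard/details_<org>__<model>").
data = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Ties",
	"harness_winogrande_5",
	split="train")
```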
## Latest results
These are the latest results from run 2023-12-09T16:52:16.188783 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/MetaMat... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation ru... | [
6,
29,
31,
178,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Ties## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of mode... |
a00e82546848083f5a58d20092cb82d97ca8f42c |
# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp](https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp",
"harness_winogrande_5",
split="train")
```
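To list the available configurations or to read the aggregated metrics directly, the same pattern applies. This is a minimal sketch: the "results" configuration and the "latest" split alias are taken from this dataset's configs below.
```python
from datasets import get_dataset_config_names, load_dataset

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(
	"open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp")

# "latest" is an alias for the most recent timestamped split of a run.
aggregated = load_dataset(
	"open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp",
	"results",
	split="latest")
```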
## Latest results
These are the [latest results from run 2023-12-09T16:53:19.272337](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp/blob/main/results_2023-12-09T16-53-19.272337.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6389439642939008,
"acc_stderr": 0.03231020427870188,
"acc_norm": 0.6389579295248086,
"acc_norm_stderr": 0.03297676323880707,
"mc1": 0.38922888616891066,
"mc1_stderr": 0.017068552680690328,
"mc2": 0.5522545162562386,
"mc2_stderr": 0.015322345793520823
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759093,
"acc_norm": 0.6569965870307167,
"acc_norm_stderr": 0.013872423223718164
},
"harness|hellaswag|10": {
"acc": 0.6550487950607449,
"acc_stderr": 0.004743808792037863,
"acc_norm": 0.8450507866958773,
"acc_norm_stderr": 0.0036111673029597833
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.036959801280988226,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.036959801280988226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846408,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846408
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179602,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179602
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787684,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787684
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681404,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681404
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.019312676065786547,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.019312676065786547
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38922888616891066,
"mc1_stderr": 0.017068552680690328,
"mc2": 0.5522545162562386,
"mc2_stderr": 0.015322345793520823
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.011251958281205083
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.01264354476287336
}
}
```
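As an illustration of how these per-task numbers can be post-processed, the following minimal sketch (assuming the JSON above has been saved locally as `results.json`, a hypothetical filename) macro-averages `acc_norm` over the `hendrycksTest` (MMLU) subtasks:
```python
import json

# Minimal sketch: macro-average acc_norm over the MMLU ("hendrycksTest") subtasks.
# Assumes the results JSON shown above was saved as results.json (hypothetical path).
with open("results.json") as f:
    metrics = json.load(f)

mmlu = {k: v for k, v in metrics.items() if k.startswith("harness|hendrycksTest-")}
mmlu_avg = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU acc_norm macro-average over {len(mmlu)} subtasks: {mmlu_avg:.4f}")
```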
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp | [
"region:us"
] | 2023-12-09T16:56:08+00:00 | {"pretty_name": "Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp](https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T16:53:19.272337](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp/blob/main/results_2023-12-09T16-53-19.272337.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6389439642939008,\n \"acc_stderr\": 0.03231020427870188,\n \"acc_norm\": 0.6389579295248086,\n \"acc_norm_stderr\": 0.03297676323880707,\n \"mc1\": 0.38922888616891066,\n \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5522545162562386,\n \"mc2_stderr\": 0.015322345793520823\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759093,\n \"acc_norm\": 0.6569965870307167,\n \"acc_norm_stderr\": 0.013872423223718164\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6550487950607449,\n \"acc_stderr\": 0.004743808792037863,\n \"acc_norm\": 0.8450507866958773,\n \"acc_norm_stderr\": 0.0036111673029597833\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n 
\"acc_norm_stderr\": 0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.036959801280988226,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.036959801280988226\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 0.01377869377846408,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.01377869377846408\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n \"acc_stderr\": 0.016513676031179602,\n \"acc_norm\": 0.4212290502793296,\n \"acc_norm_stderr\": 0.016513676031179602\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n \"acc_stderr\": 0.012689708167787684,\n \"acc_norm\": 0.4439374185136897,\n \"acc_norm_stderr\": 0.012689708167787684\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681404,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681404\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786547,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786547\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5522545162562386,\n \"mc2_stderr\": 0.015322345793520823\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205083\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6982562547384382,\n \"acc_stderr\": 0.01264354476287336\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-53-19.272337.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-53-19.272337.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-53-19.272337.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-53-19.272337.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-53-19.272337.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T16_53_19.272337", "path": ["**/details_harness|winogrande|5_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T16-53-19.272337.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T16_53_19.272337", "path": ["results_2023-12-09T16-53-19.272337.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T16-53-19.272337.parquet"]}]}]} | 2023-12-09T16:56:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
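The processed card masks the exact repository URL, so the repo id below is an inference from the `details_<org>__<model>` naming convention used by the other leaderboard cards in this dump, not a value confirmed by the source; a minimal loading sketch under that assumption:
```python
from datasets import load_dataset

# Assumed repo id, inferred from the details_<org>__<model> naming convention
# used by the other Open LLM Leaderboard details repos in this dump.
data = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__MetaMath-neural-chat-7b-v3-2-Slerp",
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="train",           # "train" always points to the latest results
)
```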
## Latest results
These are the latest results from run 2023-12-09T16:53:19.272337 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/MetaMa... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation r... | [
6,
30,
31,
179,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/MetaMath-neural-chat-7b-v3-2-Slerp## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of mod... |
fbbdf2d39eea62411eff3c1212f44928cb20c356 | # Dataset Card for "pile_dedupe_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | xwjiang2010/pile_dedupe_val | [
"region:us"
] | 2023-12-09T16:58:31+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6062337711, "num_examples": 1000000}], "download_size": 3343428302, "dataset_size": 6062337711}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-09T18:28:58+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "pile_dedupe_val"
More Information needed | [
"# Dataset Card for \"pile_dedupe_val\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"pile_dedupe_val\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"pile_dedupe_val\"\n\nMore Information needed"
] |
d311cb08003cf168e22a52bdda3010ea94d8cdeb |
# Dataset Card for Evaluation run of Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties](https://huggingface.co/Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-NeuralHermes-2.5-Mistral-7B-Ties",
"harness_winogrande_5",
split="train")
```
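If useful, the available configurations of this details repo can also be enumerated programmatically; a short sketch using the `datasets` helper (the card states there are 63 per-task configurations plus the aggregated "results" one):
```python
from datasets import get_dataset_config_names

# Enumerate the configurations exposed by this details repo.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Weyaxi__MetaMath-NeuralHermes-2.5-Mistral-7B-Ties"
)
print(len(configs), configs[:5])
```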
## Latest results
These are the [latest results from run 2023-12-09T16:59:41.207552](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-NeuralHermes-2.5-Mistral-7B-Ties/blob/main/results_2023-12-09T16-59-41.207552.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6257025979407843,
"acc_stderr": 0.03245342362812811,
"acc_norm": 0.6259954931770727,
"acc_norm_stderr": 0.03311192058156274,
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.501521774455576,
"mc2_stderr": 0.01581364594434788
},
"harness|arc:challenge|25": {
"acc": 0.5947098976109215,
"acc_stderr": 0.014346869060229315,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6485759808803028,
"acc_stderr": 0.004764393985111037,
"acc_norm": 0.828918542123083,
"acc_norm_stderr": 0.0037581050431501253
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055263,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055263
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764815,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764815
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431385,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040696,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040696
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922526,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3575418994413408,
"acc_stderr": 0.016029394474894886,
"acc_norm": 0.3575418994413408,
"acc_norm_stderr": 0.016029394474894886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669963,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988626,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355435,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.501521774455576,
"mc2_stderr": 0.01581364594434788
},
"harness|winogrande|5": {
"acc": 0.7513812154696132,
"acc_stderr": 0.012147314713403108
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.012705685723131709
}
}
```
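To pull these aggregated numbers directly instead of copying them from the JSON above, the "results" configuration can be loaded with its "latest" split; a sketch, assuming this repo follows the same config layout as the other details repos in this dump:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# points at the most recent one (the JSON shown above).
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__MetaMath-NeuralHermes-2.5-Mistral-7B-Ties",
    "results",
    split="latest",
)
print(results[0])  # one row per evaluation run
```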
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__MetaMath-NeuralHermes-2.5-Mistral-7B-Ties | [
"region:us"
] | 2023-12-09T17:02:33+00:00 | {"pretty_name": "Evaluation run of Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties](https://huggingface.co/Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__MetaMath-NeuralHermes-2.5-Mistral-7B-Ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T16:59:41.207552](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-NeuralHermes-2.5-Mistral-7B-Ties/blob/main/results_2023-12-09T16-59-41.207552.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6257025979407843,\n \"acc_stderr\": 0.03245342362812811,\n \"acc_norm\": 0.6259954931770727,\n \"acc_norm_stderr\": 0.03311192058156274,\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.501521774455576,\n \"mc2_stderr\": 0.01581364594434788\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.014346869060229315,\n \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6485759808803028,\n \"acc_stderr\": 0.004764393985111037,\n \"acc_norm\": 0.828918542123083,\n \"acc_norm_stderr\": 0.0037581050431501253\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099834,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099834\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055263,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055263\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764815,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764815\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431385,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431385\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.02126271940040696,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.02126271940040696\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n 
\"acc_stderr\": 0.013853724170922526,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922526\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3575418994413408,\n \"acc_stderr\": 0.016029394474894886,\n \"acc_norm\": 0.3575418994413408,\n \"acc_norm_stderr\": 0.016029394474894886\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n \"acc_stderr\": 0.012728446067669963,\n \"acc_norm\": 0.4595827900912647,\n \"acc_norm_stderr\": 0.012728446067669963\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988626,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988626\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355435,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355435\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.501521774455576,\n \"mc2_stderr\": 0.01581364594434788\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403108\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \"acc_stderr\": 0.012705685723131709\n }\n}\n```", 
"repo_url": "https://huggingface.co/Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-59-41.207552.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-59-41.207552.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-59-41.207552.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T16-59-41.207552.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-59-41.207552.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T16_59_41.207552", "path": ["**/details_harness|winogrande|5_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T16-59-41.207552.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T16_59_41.207552", "path": ["results_2023-12-09T16-59-41.207552.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T16-59-41.207552.parquet"]}]}]} | 2023-12-09T17:03:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
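A minimal sketch of such a call (the repository name is assumed from the org__model naming convention these detail datasets follow; the URLs were stripped in this rendering):

```python
from datasets import load_dataset

# Repository name assumed from the org__model convention used by these detail datasets.
data = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__MetaMath-NeuralHermes-2.5-Mistral-7B-Ties",
    "harness_winogrande_5",
    split="train",
)
```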
## Latest results
These are the latest results from run 2023-12-09T16:59:41.207552 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evalu... | [
6,
31,
31,
180,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/MetaMath-NeuralHermes-2.5-Mistral-7B-Ties## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run... |
2cc45015f5badebe0df62e77201687c79aa0a23a |
# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_instruct_v_0.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mwitiderrick/open_llama_3b_instruct_v_0.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mwitiderrick/open_llama_3b_instruct_v_0.2](https://huggingface.co/mwitiderrick/open_llama_3b_instruct_v_0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mwitiderrick__open_llama_3b_instruct_v_0.2",
"harness_winogrande_5",
split="train")
```
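
Any of the per-task configurations listed in this card's metadata can be loaded the same way. As a further sketch, the aggregated "results" configuration exposes a "latest" split that always resolves to the most recent run:

```python
from datasets import load_dataset

# Load the aggregated metrics; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_mwitiderrick__open_llama_3b_instruct_v_0.2",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated scores per run
```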
## Latest results
These are the [latest results from run 2023-12-09T17:00:38.950221](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__open_llama_3b_instruct_v_0.2/blob/main/results_2023-12-09T17-00-38.950221.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26151949284222964,
"acc_stderr": 0.03090435201991025,
"acc_norm": 0.26267138346311003,
"acc_norm_stderr": 0.03166222554081637,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.3816212066330296,
"mc2_stderr": 0.013929117644890942
},
"harness|arc:challenge|25": {
"acc": 0.36177474402730375,
"acc_stderr": 0.014041957945038059,
"acc_norm": 0.3848122866894198,
"acc_norm_stderr": 0.014218371065251107
},
"harness|hellaswag|10": {
"acc": 0.4953196574387572,
"acc_stderr": 0.00498956279828052,
"acc_norm": 0.6676956781517626,
"acc_norm_stderr": 0.004700767741735565
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.026749899771241238,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.026749899771241238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.021132859182754444,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.021132859182754444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471276,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471276
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764815,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764815
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293753,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293753
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.02136202772522271,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.02136202772522271
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715484,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715484
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696545,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696545
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.018224078117299085,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.018224078117299085
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.026232878971491666,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.026232878971491666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.19631901840490798,
"acc_stderr": 0.031207970394709215,
"acc_norm": 0.19631901840490798,
"acc_norm_stderr": 0.031207970394709215
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.02905858830374884,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.02905858830374884
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2886334610472541,
"acc_stderr": 0.016203792703197797,
"acc_norm": 0.2886334610472541,
"acc_norm_stderr": 0.016203792703197797
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.024288619466046105,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.024288619466046105
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.025403832978179622,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.025403832978179622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.02657786094330786,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.02657786094330786
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23989569752281617,
"acc_stderr": 0.010906282617981634,
"acc_norm": 0.23989569752281617,
"acc_norm_stderr": 0.010906282617981634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250075,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.026358916334904035,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.026358916334904035
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.3816212066330296,
"mc2_stderr": 0.013929117644890942
},
"harness|winogrande|5": {
"acc": 0.6345698500394633,
"acc_stderr": 0.013533965097638795
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723889946
}
}
```
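
For quick offline analysis, here is a minimal sketch (the local filename is hypothetical, and it assumes the dict printed above was saved as-is) that averages the acc_norm of the 57 MMLU (hendrycksTest) subtasks:

```python
import json

# Hypothetical local copy of the results dict shown above.
with open("latest_results.json") as f:
    scores = json.load(f)

# Collect acc_norm for every MMLU (hendrycksTest) subtask and average them.
mmlu = [v["acc_norm"] for k, v in scores.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU acc_norm averaged over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```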
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_mwitiderrick__open_llama_3b_instruct_v_0.2 | [
"region:us"
] | 2023-12-09T17:02:51+00:00 | {"pretty_name": "Evaluation run of mwitiderrick/open_llama_3b_instruct_v_0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [mwitiderrick/open_llama_3b_instruct_v_0.2](https://huggingface.co/mwitiderrick/open_llama_3b_instruct_v_0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mwitiderrick__open_llama_3b_instruct_v_0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T17:00:38.950221](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__open_llama_3b_instruct_v_0.2/blob/main/results_2023-12-09T17-00-38.950221.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26151949284222964,\n \"acc_stderr\": 0.03090435201991025,\n \"acc_norm\": 0.26267138346311003,\n \"acc_norm_stderr\": 0.03166222554081637,\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.3816212066330296,\n \"mc2_stderr\": 0.013929117644890942\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.36177474402730375,\n \"acc_stderr\": 0.014041957945038059,\n \"acc_norm\": 0.3848122866894198,\n \"acc_norm_stderr\": 0.014218371065251107\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4953196574387572,\n \"acc_stderr\": 0.00498956279828052,\n \"acc_norm\": 0.6676956781517626,\n \"acc_norm_stderr\": 0.004700767741735565\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241238,\n \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241238\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.021132859182754444,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.021132859182754444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n \"acc_stderr\": 0.024993053397764815,\n \"acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.024993053397764815\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293753,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293753\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n \"acc_norm\": 0.21761658031088082,\n 
\"acc_norm_stderr\": 0.029778663037752954\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.02136202772522271,\n \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.02136202772522271\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715484,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715484\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23669724770642203,\n \"acc_stderr\": 0.018224078117299085,\n \"acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.018224078117299085\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.18055555555555555,\n \"acc_stderr\": 0.026232878971491666,\n \"acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.026232878971491666\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923393,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923393\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.036412970813137276,\n \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.036412970813137276\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.19631901840490798,\n \"acc_stderr\": 0.031207970394709215,\n \"acc_norm\": 0.19631901840490798,\n \"acc_norm_stderr\": 0.031207970394709215\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.02905858830374884,\n \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.02905858830374884\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n 
},\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2886334610472541,\n \"acc_stderr\": 0.016203792703197797,\n \"acc_norm\": 0.2886334610472541,\n \"acc_norm_stderr\": 0.016203792703197797\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.024476994076247333,\n \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.024476994076247333\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.024288619466046105,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.024288619466046105\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n \"acc_stderr\": 0.025403832978179622,\n \"acc_norm\": 0.2765273311897106,\n \"acc_norm_stderr\": 0.025403832978179622\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2730496453900709,\n \"acc_stderr\": 0.02657786094330786,\n \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.02657786094330786\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n \"acc_stderr\": 0.010906282617981634,\n \"acc_norm\": 0.23989569752281617,\n \"acc_norm_stderr\": 0.010906282617981634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250075,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250075\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904035,\n \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904035\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.3816212066330296,\n \"mc2_stderr\": 0.013929117644890942\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6345698500394633,\n \"acc_stderr\": 0.013533965097638795\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \"acc_stderr\": 0.0034478192723889946\n }\n}\n```", "repo_url": "https://huggingface.co/mwitiderrick/open_llama_3b_instruct_v_0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-00-38.950221.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-00-38.950221.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-00-38.950221.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-00-38.950221.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-00-38.950221.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["**/details_harness|winogrande|5_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-09T17-00-38.950221.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T17_00_38.950221", "path": ["results_2023-12-09T17-00-38.950221.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T17-00-38.950221.parquet"]}]}]} | 2023-12-09T17:03:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_instruct_v_0.2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_instruct_v_0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
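A minimal sketch of what that looks like, following the loading snippet used elsewhere in these cards; the repository name below is inferred from the model name via the leaderboard's `details_<org>__<model>` convention, not stated in this rendering:

```python
from datasets import load_dataset

# Repository name inferred from the "details_<org>__<model>" convention;
# any of the 63 configurations (e.g. "harness_winogrande_5") can be loaded.
data = load_dataset("open-llm-leaderboard/details_mwitiderrick__open_llama_3b_instruct_v_0.2",
	"harness_winogrande_5",
	split="train")
```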
## Latest results
These are the latest results from run 2023-12-09T17:00:38.950221 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_instruct_v_0.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model mwitiderrick/... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_instruct_v_0.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation r... | [
6,
28,
31,
177,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_instruct_v_0.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of mod... |
bba602698fd6f18226f6404fd8fb637ef67fa1eb |
# Financial Sentiment Analysis Dataset
## Overview
This dataset is a comprehensive collection of tweets focused on financial topics, meticulously curated to assist in sentiment analysis in the domain of finance and stock markets. It serves as a valuable resource for training machine learning models to understand and predict sentiment trends based on social media discourse, particularly within the financial sector.
## Data Description
The dataset comprises tweets related to financial markets, stocks, and economic discussions. Each tweet is labeled with a sentiment value, where '1' denotes a positive sentiment, '2' signifies a negative sentiment, and '0' indicates a neutral sentiment. The dataset has undergone thorough preprocessing, including sentiment mapping and the removal of duplicate entries, to ensure data quality and consistency.
### Dataset Structure
- **Tweet**: The text of the tweet, providing insights into financial discussions.
- **Sentiment**: A numerical label indicating the sentiment of the tweet (1 for bullish, 2 for bearish, and 0 for neutral).
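For concreteness, a minimal sketch of loading the dataset and mapping the integer labels back to their names; the `tweet` and `sentiment` feature names match this card's metadata, which defines `sentiment` as a class label with names `neutral`, `bullish`, `bearish`:

```python
from datasets import load_dataset

# Load the single "train" split from the Hugging Face Hub.
ds = load_dataset("TimKoornstra/financial-tweets-sentiment", split="train")

# `sentiment` is a ClassLabel feature; recover its human-readable names.
label_names = ds.features["sentiment"].names  # ['neutral', 'bullish', 'bearish']

example = ds[0]
print(example["tweet"], "->", label_names[example["sentiment"]])
```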
## Dataset Size
- **Bullish Sentiments**: 17,368
- **Bearish Sentiments**: 8,542
- **Neutral Sentiments**: 12,181
## Sources
This dataset is an amalgamation of data from various reputable sources, each contributing a unique perspective on financial sentiment:
- [FIQA Sentiment Classification](https://huggingface.co/datasets/ChanceFocus/fiqa-sentiment-classification): A sentiment analysis dataset with 721 positive, 379 negative, and 11 neutral sentiments.
- [Stock Market Tweets Data](https://ieee-dataport.org/open-access/stock-market-tweets-data): A collection of tweets with 523 positive, 420 neutral, and 341 negative sentiments.
- [Stock Related Tweet Sentiment](https://www.kaggle.com/datasets/mattgilgo/stock-related-tweet-sentiment): A dataset featuring 5005 positive, 741 neutral, and 736 negative sentiments.
- [Master Thesis Data](https://github.com/moritzwilksch/MasterThesis/tree/main): Includes 3711 positive, 2784 neutral, and 2167 negative sentiments.
- [Twitter Stock Sentiment](https://github.com/poojathakoor/twitter-stock-sentiment): Comprises 702 positive, 595 negative, and 481 neutral sentiments.
- [Crypto Sentiment](https://github.com/surge-ai/crypto-sentiment/tree/main): Sentiment data for cryptocurrency-related tweets with 296 positive and 256 negative sentiments.
- [Stock Sentiment](https://github.com/surge-ai/stock-sentiment/tree/main): Sentiment analysis on stock-related tweets, including 327 positive and 173 negative sentiments.
- [Stockmarket Sentiment Dataset](https://www.kaggle.com/datasets/yash612/stockmarket-sentiment-dataset): Features 3685 positive and 2106 negative sentiments.
- [Twitter Financial News Sentiment](https://huggingface.co/datasets/zeroshot/twitter-financial-news-sentiment): Contains 2398 positive, 1789 negative, and 7744 neutral sentiments.
## Usage
This dataset is ideal for training and evaluating machine learning models for sentiment analysis, especially those focused on understanding market trends and investor sentiment. It can be used for academic research, financial market analysis, and developing AI tools for financial institutions.
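As one possible starting point, a hedged fine-tuning sketch using the `transformers` `Trainer`; the base checkpoint, split ratio, and hyperparameters are illustrative assumptions, not recommendations from the dataset authors:

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Hypothetical base checkpoint; any 3-class sequence classifier would do.
checkpoint = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=3)

ds = load_dataset("TimKoornstra/financial-tweets-sentiment", split="train")
ds = ds.train_test_split(test_size=0.1, seed=42)  # hold out 10% for evaluation

def tokenize(batch):
    return tokenizer(batch["tweet"], truncation=True)

ds = ds.map(tokenize, batched=True).rename_column("sentiment", "labels")

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="financial-sentiment-model",
                           num_train_epochs=1),
    train_dataset=ds["train"],
    eval_dataset=ds["test"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```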
## Acknowledgments
We extend our heartfelt gratitude to all the authors and contributors of the original datasets. Their efforts in data collection and curation have been pivotal in creating this comprehensive resource.
## License
This dataset is made available under the MIT license, adhering to the licensing terms of the original datasets. | TimKoornstra/financial-tweets-sentiment | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"sentiment",
"twitter",
"finance",
"crypto",
"stocks",
"tweet",
"collection",
"region:us"
] | 2023-12-09T17:03:27+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"], "pretty_name": "Financial Tweets with Sentiment class", "dataset_info": {"features": [{"name": "tweet", "dtype": "string"}, {"name": "sentiment", "dtype": {"class_label": {"names": {"0": "neutral", "1": "bullish", "2": "bearish"}}}}, {"name": "url", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6848991, "num_examples": 38091}], "download_size": 2648082, "dataset_size": 6848991}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["sentiment", "twitter", "finance", "crypto", "stocks", "tweet", "collection"]} | 2023-12-20T11:04:21+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #size_categories-10K<n<100K #language-English #license-mit #sentiment #twitter #finance #crypto #stocks #tweet #collection #region-us
|
# Financial Sentiment Analysis Dataset
## Overview
This dataset is a comprehensive collection of tweets focused on financial topics, meticulously curated to assist in sentiment analysis in the domain of finance and stock markets. It serves as a valuable resource for training machine learning models to understand and predict sentiment trends based on social media discourse, particularly within the financial sector.
## Data Description
The dataset comprises tweets related to financial markets, stocks, and economic discussions. Each tweet is labeled with a sentiment value, where '1' denotes a positive sentiment, '2' signifies a negative sentiment, and '0' indicates a neutral sentiment. The dataset has undergone thorough preprocessing, including sentiment mapping and the removal of duplicate entries, to ensure data quality and consistency.
### Dataset Structure
- Tweet: The text of the tweet, providing insights into financial discussions.
- Sentiment: A numerical label indicating the sentiment of the tweet (1 for bullish, 2 for bearish, and 0 for neutral).
## Dataset Size
- Bullish Sentiments: 17,368
- Bearish Sentiments: 8,542
- Neutral Sentiments: 12,181
## Sources
This dataset is an amalgamation of data from various reputable sources, each contributing a unique perspective on financial sentiment:
- FIQA Sentiment Classification: A sentiment analysis dataset with 721 positive, 379 negative, and 11 neutral sentiments.
- Stock Market Tweets Data: A collection of tweets with 523 positive, 420 neutral, and 341 negative sentiments.
- Stock Related Tweet Sentiment: A dataset featuring 5005 positive, 741 neutral, and 736 negative sentiments.
- Master Thesis Data: Includes 3711 positive, 2784 neutral, and 2167 negative sentiments.
- Twitter Stock Sentiment: Comprises 702 positive, 595 negative, and 481 neutral sentiments.
- Crypto Sentiment: Sentiment data for cryptocurrency-related tweets with 296 positive and 256 negative sentiments.
- Stock Sentiment: Sentiment analysis on stock-related tweets, including 327 positive and 173 negative sentiments.
- Stockmarket Sentiment Dataset: Features 3685 positive and 2106 negative sentiments.
- Twitter Financial News Sentiment: Contains 2398 positive, 1789 negative, and 7744 neutral sentiments.
## Usage
This dataset is ideal for training and evaluating machine learning models for sentiment analysis, especially those focused on understanding market trends and investor sentiment. It can be used for academic research, financial market analysis, and developing AI tools for financial institutions.
## Acknowledgments
We extend our heartfelt gratitude to all the authors and contributors of the original datasets. Their efforts in data collection and curation have been pivotal in creating this comprehensive resource.
## License
This dataset is made available under the MIT license, adhering to the licensing terms of the original datasets. | [
"# Financial Sentiment Analysis Dataset",
"## Overview\nThis dataset is a comprehensive collection of tweets focused on financial topics, meticulously curated to assist in sentiment analysis in the domain of finance and stock markets. It serves as a valuable resource for training machine learning models to unders... | [
"TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-English #license-mit #sentiment #twitter #finance #crypto #stocks #tweet #collection #region-us \n",
"# Financial Sentiment Analysis Dataset",
"## Overview\nThis dataset is a comprehensive collection of tweets focused on financial... | [
57,
8,
71,
98,
54,
30,
250,
53,
47,
29
] | [
"passage: TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-English #license-mit #sentiment #twitter #finance #crypto #stocks #tweet #collection #region-us \n# Financial Sentiment Analysis Dataset## Overview\nThis dataset is a comprehensive collection of tweets focused on financial to... |
a82b439c4c9afe4290f4517c7a764a75a2501e3c |
# Dataset Card for Evaluation run of aloobun/open-llama-3b-v2-elmv3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/aloobun/open-llama-3b-v2-elmv3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [aloobun/open-llama-3b-v2-elmv3](https://huggingface.co/aloobun/open-llama-3b-v2-elmv3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3",
"harness_winogrande_5",
split="train")
```
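The same pattern loads the aggregated "results" configuration; the `split="latest"` argument below is an assumption based on the split naming used in these cards' metadata, where splits are otherwise named by run timestamp:

```python
from datasets import load_dataset

# "latest" mirrors the most recent run; timestamped splits hold older runs.
results = load_dataset("open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3",
	"results",
	split="latest")
```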
## Latest results
These are the [latest results from run 2023-12-09T18:25:59.224844](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3/blob/main/results_2023-12-09T18-25-59.224844.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2804692579613333,
"acc_stderr": 0.03160774886030324,
"acc_norm": 0.28199113779250456,
"acc_norm_stderr": 0.0323576565422058,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3550624387136162,
"mc2_stderr": 0.01364292328900912
},
"harness|arc:challenge|25": {
"acc": 0.3873720136518771,
"acc_stderr": 0.014235872487909874,
"acc_norm": 0.42150170648464164,
"acc_norm_stderr": 0.014430197069326023
},
"harness|hellaswag|10": {
"acc": 0.551185022903804,
"acc_stderr": 0.004963567029129055,
"acc_norm": 0.7326229834694284,
"acc_norm_stderr": 0.004416861919100999
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307811,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307811
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843673,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843673
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.03416520447747549,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.03416520447747549
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325628,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325628
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333339,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333339
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358611,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358611
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28974358974358977,
"acc_stderr": 0.02300062824368796,
"acc_norm": 0.28974358974358977,
"acc_norm_stderr": 0.02300062824368796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2773109243697479,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.2773109243697479,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.025967420958258533,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.025967420958258533
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.04620284082280039,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.04620284082280039
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094476,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094476
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28607918263090676,
"acc_stderr": 0.016160871405127532,
"acc_norm": 0.28607918263090676,
"acc_norm_stderr": 0.016160871405127532
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410622,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.025251173936495022,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.025251173936495022
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843017,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843017
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827065,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827065
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22794117647058823,
"acc_stderr": 0.025483081468029804,
"acc_norm": 0.22794117647058823,
"acc_norm_stderr": 0.025483081468029804
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.33877551020408164,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.33877551020408164,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3550624387136162,
"mc2_stderr": 0.01364292328900912
},
"harness|winogrande|5": {
"acc": 0.6495659037095501,
"acc_stderr": 0.013409047676670184
},
"harness|gsm8k|5": {
"acc": 0.037149355572403335,
"acc_stderr": 0.0052095162830737675
}
}
```
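To work with these numbers programmatically, here is a small sketch that downloads the results file linked above and averages `acc_norm` over the MMLU (`hendrycksTest`) tasks. The top-level layout of the raw JSON file is an assumption (it may nest the metrics under a `"results"` key), so the code falls back to the payload itself, which matches the dict printed above:

```python
import json
from huggingface_hub import hf_hub_download

# Download the results file referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3",
    filename="results_2023-12-09T18-25-59.224844.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)

# Assumption: the raw file may nest per-task metrics under a "results" key;
# otherwise the payload itself is the dict shown in the block above.
results = payload.get("results", payload)

mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest")}
mean_acc_norm = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU tasks: {len(mmlu)} | mean acc_norm: {mean_acc_norm:.4f}")
```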
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3 | [
"region:us"
] | 2023-12-09T17:20:39+00:00 | {"pretty_name": "Evaluation run of aloobun/open-llama-3b-v2-elmv3", "dataset_summary": "Dataset automatically created during the evaluation run of model [aloobun/open-llama-3b-v2-elmv3](https://huggingface.co/aloobun/open-llama-3b-v2-elmv3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T18:25:59.224844](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3/blob/main/results_2023-12-09T18-25-59.224844.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2804692579613333,\n \"acc_stderr\": 0.03160774886030324,\n \"acc_norm\": 0.28199113779250456,\n \"acc_norm_stderr\": 0.0323576565422058,\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3550624387136162,\n \"mc2_stderr\": 0.01364292328900912\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3873720136518771,\n \"acc_stderr\": 0.014235872487909874,\n \"acc_norm\": 0.42150170648464164,\n \"acc_norm_stderr\": 0.014430197069326023\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.551185022903804,\n \"acc_stderr\": 0.004963567029129055,\n \"acc_norm\": 0.7326229834694284,\n \"acc_norm_stderr\": 0.004416861919100999\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137282,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137282\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n 
\"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307811,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307811\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.04096985139843673,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.04096985139843673\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.03416520447747549,\n \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.03416520447747549\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325628,\n \"acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325628\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03333333333333339,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03333333333333339\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358611,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358611\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268049,\n \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268049\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845426,\n \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845426\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.28974358974358977,\n \"acc_stderr\": 0.02300062824368796,\n \"acc_norm\": 0.28974358974358977,\n \"acc_norm_stderr\": 0.02300062824368796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2773109243697479,\n \"acc_stderr\": 0.029079374539480007,\n \"acc_norm\": 0.2773109243697479,\n \"acc_norm_stderr\": 0.029079374539480007\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25137614678899084,\n \"acc_stderr\": 0.018599206360287415,\n \"acc_norm\": 0.25137614678899084,\n \"acc_norm_stderr\": 0.018599206360287415\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.17592592592592593,\n \"acc_stderr\": 0.025967420958258533,\n \"acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.025967420958258533\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.37668161434977576,\n \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.04620284082280039,\n \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.04620284082280039\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.029343114798094476,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.029343114798094476\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.28607918263090676,\n \"acc_stderr\": 0.016160871405127532,\n \"acc_norm\": 0.28607918263090676,\n \"acc_norm_stderr\": 0.016160871405127532\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.27009646302250806,\n \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495022,\n \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495022\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843017,\n \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843017\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n \"acc_stderr\": 0.010936550813827065,\n \"acc_norm\": 0.24185136897001303,\n \"acc_norm_stderr\": 0.010936550813827065\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029804,\n \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029804\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.33877551020408164,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.33877551020408164,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.03660298834049163,\n \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.03660298834049163\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3550624387136162,\n \"mc2_stderr\": 0.01364292328900912\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6495659037095501,\n \"acc_stderr\": 0.013409047676670184\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.037149355572403335,\n 
\"acc_stderr\": 0.0052095162830737675\n }\n}\n```", "repo_url": "https://huggingface.co/aloobun/open-llama-3b-v2-elmv3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|arc:challenge|25_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|gsm8k|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hellaswag|10_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-18-30.999840.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-18-30.999840.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-25-59.224844.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-25-59.224844.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-25-59.224844.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T18-25-59.224844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-25-59.224844.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": 
"2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-18-30.999840.parquet"]}, 
{"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["**/details_harness|winogrande|5_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": ["**/details_harness|winogrande|5_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T18-25-59.224844.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T17_18_30.999840", "path": ["results_2023-12-09T17-18-30.999840.parquet"]}, {"split": "2023_12_09T18_25_59.224844", "path": 
["results_2023-12-09T18-25-59.224844.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T18-25-59.224844.parquet"]}]}]} | 2023-12-09T18:28:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of aloobun/open-llama-3b-v2-elmv3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model aloobun/open-llama-3b-v2-elmv3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
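A minimal sketch, assuming this card's details dataset follows the same `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other evaluation cards in this collection:

```python
from datasets import load_dataset

# Load the per-sample details for one eval configuration of this run.
# The repository name below is inferred from the naming pattern above.
data = load_dataset("open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3",
	"harness_winogrande_5",
	split="train")
```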
## Latest results
These are the latest results from run 2023-12-09T18:25:59.224844 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
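Each eval configuration also exposes one split per run timestamp plus a `latest` alias; a minimal sketch for reading the newest run, under the same naming assumption as above:

```python
from datasets import load_dataset

# "latest" always aliases the most recent run's split for this eval.
latest = load_dataset("open-llm-leaderboard/details_aloobun__open-llama-3b-v2-elmv3",
	"harness_gsm8k_5",
	split="latest")
```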
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of aloobun/open-llama-3b-v2-elmv3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model aloobun/open-llama-3b-v2... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of aloobun/open-llama-3b-v2-elmv3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model... | [
6,
25,
31,
174,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aloobun/open-llama-3b-v2-elmv3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aloobun/... |
ec601e6256d2cf3b165370b2377774952bbf7738 |
# Dataset Card for Evaluation run of yyjjtt/test-model
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yyjjtt/test-model
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yyjjtt/test-model](https://huggingface.co/yyjjtt/test-model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yyjjtt__test-model",
"harness_winogrande_5",
split="train")
```
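The aggregated metrics described above live in the `results` configuration; a minimal sketch for loading them (per the config listing, the `latest` split aliases the newest run):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics for each run of this model.
results = load_dataset("open-llm-leaderboard/details_yyjjtt__test-model",
	"results",
	split="latest")
```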
## Latest results
These are the [latest results from run 2023-12-09T17:29:33.707881](https://huggingface.co/datasets/open-llm-leaderboard/details_yyjjtt__test-model/blob/main/results_2023-12-09T17-29-33.707881.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2582521615185269,
"acc_stderr": 0.030847868754913528,
"acc_norm": 0.2593091182470286,
"acc_norm_stderr": 0.03166965639566685,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015025,
"mc2": 0.44593057174416123,
"mc2_stderr": 0.015586502428911173
},
"harness|arc:challenge|25": {
"acc": 0.20392491467576793,
"acc_stderr": 0.011774262478702247,
"acc_norm": 0.2440273037542662,
"acc_norm_stderr": 0.012551447627856262
},
"harness|hellaswag|10": {
"acc": 0.2876916948814977,
"acc_stderr": 0.004517614647703248,
"acc_norm": 0.30173272256522604,
"acc_norm_stderr": 0.0045807181159925135
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.039725528847851375,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.039725528847851375
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882921,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882921
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708083,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708083
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.02960562398177122,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.02960562398177122
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.02528441611490016,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.02528441611490016
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198913,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198913
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.02136202772522273,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.02136202772522273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507384,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507384
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958945,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958945
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3192660550458716,
"acc_stderr": 0.01998782906975001,
"acc_norm": 0.3192660550458716,
"acc_norm_stderr": 0.01998782906975001
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501967,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501967
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.03826076324884864,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.03826076324884864
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20085470085470086,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.20085470085470086,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28735632183908044,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.28735632183908044,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.024288619466046105,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.024288619466046105
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24437299035369775,
"acc_stderr": 0.024406162094668907,
"acc_norm": 0.24437299035369775,
"acc_norm_stderr": 0.024406162094668907
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23272490221642764,
"acc_stderr": 0.010792595553888496,
"acc_norm": 0.23272490221642764,
"acc_norm_stderr": 0.010792595553888496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714857,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714857
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.01663931935031326,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.01663931935031326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984925,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984925
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.02540930195322568,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.02540930195322568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.03696584317010601,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.03696584317010601
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015025,
"mc2": 0.44593057174416123,
"mc2_stderr": 0.015586502428911173
},
"harness|winogrande|5": {
"acc": 0.5082872928176796,
"acc_stderr": 0.014050555322824189
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
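If you prefer working with the raw JSON file linked above rather than going through the `datasets` library, it can be fetched directly from the dataset repo. A minimal sketch; the top-level key layout of the file is an assumption, so it inspects the keys rather than assuming them:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_yyjjtt__test-model",
    filename="results_2023-12-09T17-29-33.707881.json",
    repo_type="dataset",
)
with open(path) as f:
    run = json.load(f)

# Check the available keys before relying on any particular layout.
print(list(run)[:5])
```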
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_yyjjtt__test-model | [
"region:us"
] | 2023-12-09T17:32:32+00:00 | {"pretty_name": "Evaluation run of yyjjtt/test-model", "dataset_summary": "Dataset automatically created during the evaluation run of model [yyjjtt/test-model](https://huggingface.co/yyjjtt/test-model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yyjjtt__test-model\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T17:29:33.707881](https://huggingface.co/datasets/open-llm-leaderboard/details_yyjjtt__test-model/blob/main/results_2023-12-09T17-29-33.707881.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2582521615185269,\n \"acc_stderr\": 0.030847868754913528,\n \"acc_norm\": 0.2593091182470286,\n \"acc_norm_stderr\": 0.03166965639566685,\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015025,\n \"mc2\": 0.44593057174416123,\n \"mc2_stderr\": 0.015586502428911173\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20392491467576793,\n \"acc_stderr\": 0.011774262478702247,\n \"acc_norm\": 0.2440273037542662,\n \"acc_norm_stderr\": 0.012551447627856262\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2876916948814977,\n \"acc_stderr\": 0.004517614647703248,\n \"acc_norm\": 0.30173272256522604,\n \"acc_norm_stderr\": 0.0045807181159925135\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.039725528847851375,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.039725528847851375\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882921,\n \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882921\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708083,\n \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708083\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n 
\"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n \"acc_stderr\": 0.02960562398177122,\n \"acc_norm\": 0.18497109826589594,\n \"acc_norm_stderr\": 0.02960562398177122\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n \"acc_stderr\": 0.02528441611490016,\n \"acc_norm\": 0.2709677419354839,\n \"acc_norm_stderr\": 0.02528441611490016\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198913,\n \"acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198913\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.23076923076923078,\n \"acc_stderr\": 0.02136202772522273,\n \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.02136202772522273\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507384,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507384\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958945,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958945\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3192660550458716,\n \"acc_stderr\": 0.01998782906975001,\n \"acc_norm\": 0.3192660550458716,\n \"acc_norm_stderr\": 0.01998782906975001\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501967,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501967\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.3004484304932735,\n \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.19444444444444445,\n \"acc_stderr\": 0.03826076324884864,\n \"acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.03826076324884864\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20085470085470086,\n \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.20085470085470086,\n \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n \"acc_stderr\": 0.0161824107306827,\n 
\"acc_norm\": 0.28735632183908044,\n \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.024288619466046105,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.024288619466046105\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24437299035369775,\n \"acc_stderr\": 0.024406162094668907,\n \"acc_norm\": 0.24437299035369775,\n \"acc_norm_stderr\": 0.024406162094668907\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23272490221642764,\n \"acc_stderr\": 0.010792595553888496,\n \"acc_norm\": 0.23272490221642764,\n \"acc_norm_stderr\": 0.010792595553888496\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714857,\n \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714857\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.01663931935031326,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.01663931935031326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n \"acc_stderr\": 0.03764425585984925,\n \"acc_norm\": 0.19090909090909092,\n \"acc_norm_stderr\": 0.03764425585984925\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.02540930195322568,\n \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.02540930195322568\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015025,\n \"mc2\": 0.44593057174416123,\n \"mc2_stderr\": 0.015586502428911173\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5082872928176796,\n \"acc_stderr\": 0.014050555322824189\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/yyjjtt/test-model", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-29-33.707881.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-29-33.707881.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-29-33.707881.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-29-33.707881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-29-33.707881.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-29-33.707881.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["**/details_harness|winogrande|5_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T17-29-33.707881.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T17_29_33.707881", "path": ["results_2023-12-09T17-29-33.707881.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T17-29-33.707881.parquet"]}]}]} | 2023-12-09T17:33:16+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yyjjtt/test-model
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model yyjjtt/test-model on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can, for instance, do the following:
## Latest results
These are the latest results from run 2023-12-09T17:29:33.707881 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of yyjjtt/test-model",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model yyjjtt/test-model on the Open LLM Lea... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yyjjtt/test-model",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model yyjjtt/test-... | [
6,
17,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yyjjtt/test-model## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yyjjtt/test-model on ... |
76adcae79aaf7dcb15551fe88274c600f1e2b55a |
# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AIDC-ai-business/Marcoroni-7B-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AIDC-ai-business/Marcoroni-7B-v2](https://huggingface.co/AIDC-ai-business/Marcoroni-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B-v2",
"harness_winogrande_5",
split="train")
```
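For the aggregated scores themselves, the "results" configuration mentioned above can be loaded the same way; a minimal sketch, assuming the `latest` split name listed in this card's config metadata:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics; its "latest"
# split points at the most recent run of this model.
results = load_dataset("open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B-v2",
                       "results",
                       split="latest")
print(results[0])  # one row of aggregated scores per run
```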
## Latest results
These are the [latest results from run 2023-12-09T17:37:34.905167](https://huggingface.co/datasets/open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B-v2/blob/main/results_2023-12-09T17-37-34.905167.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6381884045265307,
"acc_stderr": 0.03230282577031498,
"acc_norm": 0.6386105198682612,
"acc_norm_stderr": 0.03296099199433783,
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6195852908845921,
"mc2_stderr": 0.015562566424717855
},
"harness|arc:challenge|25": {
"acc": 0.659556313993174,
"acc_stderr": 0.013847460518892978,
"acc_norm": 0.6825938566552902,
"acc_norm_stderr": 0.013602239088038167
},
"harness|hellaswag|10": {
"acc": 0.6810396335391357,
"acc_stderr": 0.00465121131163384,
"acc_norm": 0.8626767576180043,
"acc_norm_stderr": 0.0034348485253881847
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404907,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266857,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064076,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064076
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.030833491146281235,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.030833491146281235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577612,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3877094972067039,
"acc_stderr": 0.016295332328155814,
"acc_norm": 0.3877094972067039,
"acc_norm_stderr": 0.016295332328155814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669968,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6195852908845921,
"mc2_stderr": 0.015562566424717855
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.01121862997251532
},
"harness|gsm8k|5": {
"acc": 0.6550416982562547,
"acc_stderr": 0.013093630133666247
}
}
```
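To consume these numbers programmatically rather than by copy-paste, the linked JSON file can be fetched from the dataset repo; a minimal sketch with `huggingface_hub` (the filename is taken from the link above and may be superseded by later runs):
```python
import json

from huggingface_hub import hf_hub_download

# Download this run's results file from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B-v2",
    filename="results_2023-12-09T17-37-34.905167.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The snippet above shows the "results" block; fall back to the top level
# in case the file stores the metrics there directly.
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```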
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B-v2 | [
"region:us"
] | 2023-12-09T17:40:28+00:00 | {"pretty_name": "Evaluation run of AIDC-ai-business/Marcoroni-7B-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [AIDC-ai-business/Marcoroni-7B-v2](https://huggingface.co/AIDC-ai-business/Marcoroni-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T17:37:34.905167](https://huggingface.co/datasets/open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B-v2/blob/main/results_2023-12-09T17-37-34.905167.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6381884045265307,\n \"acc_stderr\": 0.03230282577031498,\n \"acc_norm\": 0.6386105198682612,\n \"acc_norm_stderr\": 0.03296099199433783,\n \"mc1\": 0.4724602203182375,\n \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6195852908845921,\n \"mc2_stderr\": 0.015562566424717855\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.013847460518892978,\n \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038167\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6810396335391357,\n \"acc_stderr\": 0.00465121131163384,\n \"acc_norm\": 0.8626767576180043,\n \"acc_norm_stderr\": 0.0034348485253881847\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266857,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266857\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064076,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064076\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.030833491146281235,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.030833491146281235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577612,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577612\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3877094972067039,\n \"acc_stderr\": 0.016295332328155814,\n \"acc_norm\": 0.3877094972067039,\n \"acc_norm_stderr\": 0.016295332328155814\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n \"acc_stderr\": 0.012728446067669968,\n \"acc_norm\": 0.4595827900912647,\n \"acc_norm_stderr\": 0.012728446067669968\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4724602203182375,\n \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6195852908845921,\n \"mc2_stderr\": 0.015562566424717855\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.01121862997251532\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6550416982562547,\n \"acc_stderr\": 
0.013093630133666247\n }\n}\n```", "repo_url": "https://huggingface.co/AIDC-ai-business/Marcoroni-7B-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-37-34.905167.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-37-34.905167.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-37-34.905167.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-37-34.905167.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-37-34.905167.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T17_37_34.905167", "path": ["**/details_harness|winogrande|5_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T17-37-34.905167.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T17_37_34.905167", "path": ["results_2023-12-09T17-37-34.905167.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T17-37-34.905167.parquet"]}]}]} | 2023-12-09T17:41:11+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model AIDC-ai-business/Marcoroni-7B-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
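A minimal sketch of that loading call, assuming this card follows the same repository naming convention (`details_<org>__<model>`) as the other evaluation-detail datasets in this collection; the inferred repository id should be verified against the actual repo:
```python
from datasets import load_dataset

# Repository id inferred from the details_<org>__<model> naming convention.
data = load_dataset("open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B-v2",
                    "harness_winogrande_5",
                    split="train")
```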
## Latest results
These are the latest results from run 2023-12-09T17:37:34.905167 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AIDC-ai-business/Marco... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mod... | [
6, 24, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model AIDC-a... |
408f4d882191c2d72774f6e56300527ec4d4a2b2 |
# Dataset Card for Evaluation run of itsliupeng/openllama-7b-base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/itsliupeng/openllama-7b-base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [itsliupeng/openllama-7b-base](https://huggingface.co/itsliupeng/openllama-7b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__openllama-7b-base",
"harness_winogrande_5",
split="train")
```
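
The aggregated scores live in the `results` configuration described above; a minimal sketch of loading them, using the `latest` split that each configuration in this card defines alongside its timestamped split:
```python
from datasets import load_dataset

# Timestamped splits pin a specific run; "latest" tracks the most recent one.
results = load_dataset("open-llm-leaderboard/details_itsliupeng__openllama-7b-base",
                       "results",
                       split="latest")
print(results[0])  # a single row holding the run's aggregated metrics
```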
## Latest results
These are the [latest results from run 2023-12-09T17:41:52.346369](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__openllama-7b-base/blob/main/results_2023-12-09T17-41-52.346369.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.42989152566033884,
"acc_stderr": 0.03449698744058074,
"acc_norm": 0.43443471590575655,
"acc_norm_stderr": 0.03530126937236681,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862677,
"mc2": 0.3664912047351792,
"mc2_stderr": 0.01364656500793206
},
"harness|arc:challenge|25": {
"acc": 0.44197952218430037,
"acc_stderr": 0.014512682523128343,
"acc_norm": 0.4616040955631399,
"acc_norm_stderr": 0.01456824555029636
},
"harness|hellaswag|10": {
"acc": 0.5703047201752639,
"acc_stderr": 0.004940208641372079,
"acc_norm": 0.7639912368054173,
"acc_norm_stderr": 0.0042375981420072475
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.042849586397533994,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.042849586397533994
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4830188679245283,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.4830188679245283,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.03724249595817729,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.03724249595817729
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.039505818611799616,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.039505818611799616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3702127659574468,
"acc_stderr": 0.03156564682236784,
"acc_norm": 0.3702127659574468,
"acc_norm_stderr": 0.03156564682236784
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.0233306540545359,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.0233306540545359
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45806451612903226,
"acc_stderr": 0.028343787250540618,
"acc_norm": 0.45806451612903226,
"acc_norm_stderr": 0.028343787250540618
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03255086769970103,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03255086769970103
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.0390369864774844,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.0390369864774844
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6321243523316062,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.6321243523316062,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4076923076923077,
"acc_stderr": 0.024915243985987837,
"acc_norm": 0.4076923076923077,
"acc_norm_stderr": 0.024915243985987837
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150013,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150013
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5614678899082569,
"acc_stderr": 0.021274713073954565,
"acc_norm": 0.5614678899082569,
"acc_norm_stderr": 0.021274713073954565
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.0316746870682898,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.0316746870682898
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5569620253164557,
"acc_stderr": 0.032335327775334835,
"acc_norm": 0.5569620253164557,
"acc_norm_stderr": 0.032335327775334835
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4260089686098655,
"acc_stderr": 0.033188332862172806,
"acc_norm": 0.4260089686098655,
"acc_norm_stderr": 0.033188332862172806
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.48466257668711654,
"acc_stderr": 0.039265223787088424,
"acc_norm": 0.48466257668711654,
"acc_norm_stderr": 0.039265223787088424
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5242718446601942,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.5242718446601942,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.03193705726200293,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.03193705726200293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.017570705239256558,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.017570705239256558
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4884393063583815,
"acc_stderr": 0.02691189868637792,
"acc_norm": 0.4884393063583815,
"acc_norm_stderr": 0.02691189868637792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.477124183006536,
"acc_stderr": 0.028599936776089786,
"acc_norm": 0.477124183006536,
"acc_norm_stderr": 0.028599936776089786
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.45980707395498394,
"acc_stderr": 0.028306190403305693,
"acc_norm": 0.45980707395498394,
"acc_norm_stderr": 0.028306190403305693
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327242,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327242
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.02853865002887864,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.02853865002887864
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3389830508474576,
"acc_stderr": 0.012089941857584477,
"acc_norm": 0.3389830508474576,
"acc_norm_stderr": 0.012089941857584477
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.02993534270787775,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.02993534270787775
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4133986928104575,
"acc_stderr": 0.019922115682786682,
"acc_norm": 0.4133986928104575,
"acc_norm_stderr": 0.019922115682786682
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.45714285714285713,
"acc_stderr": 0.031891418324213966,
"acc_norm": 0.45714285714285713,
"acc_norm_stderr": 0.031891418324213966
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5771144278606966,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.5771144278606966,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.037400593820293204,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.037400593820293204
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.03762738699917057,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.03762738699917057
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862677,
"mc2": 0.3664912047351792,
"mc2_stderr": 0.01364656500793206
},
"harness|winogrande|5": {
"acc": 0.7087608524072613,
"acc_stderr": 0.012769029305370702
},
"harness|gsm8k|5": {
"acc": 0.09628506444275967,
"acc_stderr": 0.008125264128215908
}
}
```
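
To work with the blob above programmatically, here is a minimal sketch (assuming the JSON has been saved locally as `results.json`, a hypothetical filename; `str.removeprefix`/`removesuffix` require Python 3.9+) that ranks the MMLU subtasks by normalized accuracy:
```python
import json

# Assumes the results JSON above was saved to results.json (hypothetical path).
with open("results.json") as f:
    results = json.load(f)

# Keep only the Hendrycks (MMLU) subtasks and strip the "harness|...|5" wrapping.
mmlu = {
    key.removeprefix("harness|hendrycksTest-").removesuffix("|5"): value["acc_norm"]
    for key, value in results.items()
    if key.startswith("harness|hendrycksTest-")
}
for task, acc_norm in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc_norm:.3f}")
```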
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_itsliupeng__openllama-7b-base | [
"region:us"
] | 2023-12-09T17:44:02+00:00 | {"pretty_name": "Evaluation run of itsliupeng/openllama-7b-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [itsliupeng/openllama-7b-base](https://huggingface.co/itsliupeng/openllama-7b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__openllama-7b-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T17:41:52.346369](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__openllama-7b-base/blob/main/results_2023-12-09T17-41-52.346369.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.42989152566033884,\n \"acc_stderr\": 0.03449698744058074,\n \"acc_norm\": 0.43443471590575655,\n \"acc_norm_stderr\": 0.03530126937236681,\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862677,\n \"mc2\": 0.3664912047351792,\n \"mc2_stderr\": 0.01364656500793206\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.44197952218430037,\n \"acc_stderr\": 0.014512682523128343,\n \"acc_norm\": 0.4616040955631399,\n \"acc_norm_stderr\": 0.01456824555029636\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5703047201752639,\n \"acc_stderr\": 0.004940208641372079,\n \"acc_norm\": 0.7639912368054173,\n \"acc_norm_stderr\": 0.0042375981420072475\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.042849586397533994,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.042849586397533994\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981749,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981749\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4830188679245283,\n \"acc_stderr\": 0.030755120364119905,\n \"acc_norm\": 0.4830188679245283,\n \"acc_norm_stderr\": 0.030755120364119905\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n 
\"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n \"acc_stderr\": 0.03724249595817729,\n \"acc_norm\": 0.3930635838150289,\n \"acc_norm_stderr\": 0.03724249595817729\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.039505818611799616,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.039505818611799616\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3702127659574468,\n \"acc_stderr\": 0.03156564682236784,\n \"acc_norm\": 0.3702127659574468,\n \"acc_norm_stderr\": 0.03156564682236784\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.28835978835978837,\n \"acc_stderr\": 0.0233306540545359,\n \"acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.0233306540545359\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45806451612903226,\n \"acc_stderr\": 0.028343787250540618,\n \"acc_norm\": 0.45806451612903226,\n \"acc_norm_stderr\": 0.028343787250540618\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03255086769970103,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03255086769970103\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.0390369864774844,\n \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.0390369864774844\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4898989898989899,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6321243523316062,\n \"acc_stderr\": 0.034801756684660366,\n \"acc_norm\": 0.6321243523316062,\n \"acc_norm_stderr\": 0.034801756684660366\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4076923076923077,\n \"acc_stderr\": 0.024915243985987837,\n \"acc_norm\": 0.4076923076923077,\n \"acc_norm_stderr\": 0.024915243985987837\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150013,\n \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150013\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5614678899082569,\n \"acc_stderr\": 0.021274713073954565,\n \"acc_norm\": 0.5614678899082569,\n \"acc_norm_stderr\": 0.021274713073954565\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.0316746870682898,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.0316746870682898\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.03492406104163613,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.03492406104163613\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5569620253164557,\n \"acc_stderr\": 0.032335327775334835,\n \"acc_norm\": 0.5569620253164557,\n \"acc_norm_stderr\": 0.032335327775334835\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n \"acc_stderr\": 0.033188332862172806,\n \"acc_norm\": 0.4260089686098655,\n \"acc_norm_stderr\": 0.033188332862172806\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.48466257668711654,\n \"acc_stderr\": 0.039265223787088424,\n \"acc_norm\": 0.48466257668711654,\n \"acc_norm_stderr\": 0.039265223787088424\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5242718446601942,\n \"acc_stderr\": 0.049449010929737795,\n \"acc_norm\": 0.5242718446601942,\n \"acc_norm_stderr\": 0.049449010929737795\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.03193705726200293,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.03193705726200293\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5925925925925926,\n 
\"acc_stderr\": 0.017570705239256558,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.017570705239256558\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4884393063583815,\n \"acc_stderr\": 0.02691189868637792,\n \"acc_norm\": 0.4884393063583815,\n \"acc_norm_stderr\": 0.02691189868637792\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.477124183006536,\n \"acc_stderr\": 0.028599936776089786,\n \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.028599936776089786\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.45980707395498394,\n \"acc_stderr\": 0.028306190403305693,\n \"acc_norm\": 0.45980707395498394,\n \"acc_norm_stderr\": 0.028306190403305693\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327242,\n \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327242\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887864,\n \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887864\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3389830508474576,\n \"acc_stderr\": 0.012089941857584477,\n \"acc_norm\": 0.3389830508474576,\n \"acc_norm_stderr\": 0.012089941857584477\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.02993534270787775,\n \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.02993534270787775\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4133986928104575,\n \"acc_stderr\": 0.019922115682786682,\n \"acc_norm\": 0.4133986928104575,\n \"acc_norm_stderr\": 0.019922115682786682\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.031891418324213966,\n \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.031891418324213966\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5771144278606966,\n \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.5771144278606966,\n \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.3614457831325301,\n \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.03762738699917057,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.03762738699917057\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862677,\n \"mc2\": 0.3664912047351792,\n \"mc2_stderr\": 0.01364656500793206\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7087608524072613,\n \"acc_stderr\": 0.012769029305370702\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09628506444275967,\n \"acc_stderr\": 0.008125264128215908\n }\n}\n```", "repo_url": 
"https://huggingface.co/itsliupeng/openllama-7b-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-41-52.346369.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-41-52.346369.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-41-52.346369.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-41-52.346369.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-41-52.346369.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-41-52.346369.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["**/details_harness|winogrande|5_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T17-41-52.346369.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T17_41_52.346369", "path": ["results_2023-12-09T17-41-52.346369.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T17-41-52.346369.parquet"]}]}]} | 2023-12-09T17:44:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of itsliupeng/openllama-7b-base
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model itsliupeng/openllama-7b-base on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
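A minimal sketch, assuming the repo id follows the `details_<org>__<model>` naming used by the other cards in this dump:
```python
from datasets import load_dataset

# Pick one task config; the "train" split always points at the latest run.
data = load_dataset("open-llm-leaderboard/details_itsliupeng__openllama-7b-base",
	"harness_winogrande_5",
	split="train")
```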
## Latest results
These are the latest results from run 2023-12-09T17:41:52.346369 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of itsliupeng/openllama-7b-base",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model itsliupeng/openllama-7b-ba... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of itsliupeng/openllama-7b-base",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model i... | [
6,
20,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of itsliupeng/openllama-7b-base## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model itsliupeng... |
e1bf464cf8dc377e784dba4b0fc84b0165c95975 |
# Dataset Card for Evaluation run of itsliupeng/openllama-7b-icl
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/itsliupeng/openllama-7b-icl
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [itsliupeng/openllama-7b-icl](https://huggingface.co/itsliupeng/openllama-7b-icl) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__openllama-7b-icl",
"harness_winogrande_5",
split="train")
```
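The aggregated scores can be pulled the same way; a minimal sketch using the "results" config and the run's timestamped split name, both of which appear in the metadata below:
```python
from datasets import load_dataset

# "latest" always tracks the newest uploaded run.
results = load_dataset("open-llm-leaderboard/details_itsliupeng__openllama-7b-icl",
	"results",
	split="latest")

# The same aggregates pinned to this specific run via its timestamped split.
pinned = load_dataset("open-llm-leaderboard/details_itsliupeng__openllama-7b-icl",
	"results",
	split="2023_12_09T17_48_05.024924")
```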
## Latest results
These are the [latest results from run 2023-12-09T17:48:05.024924](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__openllama-7b-icl/blob/main/results_2023-12-09T17-48-05.024924.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.44441569324312047,
"acc_stderr": 0.03430171403503658,
"acc_norm": 0.4497976132436587,
"acc_norm_stderr": 0.035089311320789345,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041836,
"mc2": 0.3706359177223847,
"mc2_stderr": 0.01391522805511699
},
"harness|arc:challenge|25": {
"acc": 0.44197952218430037,
"acc_stderr": 0.014512682523128345,
"acc_norm": 0.47952218430034127,
"acc_norm_stderr": 0.014599131353035007
},
"harness|hellaswag|10": {
"acc": 0.5676160127464649,
"acc_stderr": 0.0049439450696114546,
"acc_norm": 0.7703644692292372,
"acc_norm_stderr": 0.004197388626940065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.038073017265045105,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.038073017265045105
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709390974,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709390974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.34893617021276596,
"acc_stderr": 0.03115852213135778,
"acc_norm": 0.34893617021276596,
"acc_norm_stderr": 0.03115852213135778
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432563,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432563
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45161290322580644,
"acc_stderr": 0.028310500348568392,
"acc_norm": 0.45161290322580644,
"acc_norm_stderr": 0.028310500348568392
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0390369864774844,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0390369864774844
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5202020202020202,
"acc_stderr": 0.03559443565563918,
"acc_norm": 0.5202020202020202,
"acc_norm_stderr": 0.03559443565563918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.616580310880829,
"acc_stderr": 0.03508984236295342,
"acc_norm": 0.616580310880829,
"acc_norm_stderr": 0.03508984236295342
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4,
"acc_stderr": 0.024838811988033158,
"acc_norm": 0.4,
"acc_norm_stderr": 0.024838811988033158
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02534809746809784,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02534809746809784
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6110091743119266,
"acc_stderr": 0.020902300887392866,
"acc_norm": 0.6110091743119266,
"acc_norm_stderr": 0.020902300887392866
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605607,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605607
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5,
"acc_stderr": 0.03509312031717982,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03509312031717982
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6160337552742616,
"acc_stderr": 0.031658678064106674,
"acc_norm": 0.6160337552742616,
"acc_norm_stderr": 0.031658678064106674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.03355746535223265,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.03355746535223265
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5521472392638037,
"acc_stderr": 0.03906947479456605,
"acc_norm": 0.5521472392638037,
"acc_norm_stderr": 0.03906947479456605
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6239316239316239,
"acc_stderr": 0.03173393632969481,
"acc_norm": 0.6239316239316239,
"acc_norm_stderr": 0.03173393632969481
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5977011494252874,
"acc_stderr": 0.01753529452906895,
"acc_norm": 0.5977011494252874,
"acc_norm_stderr": 0.01753529452906895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.026864624366756653,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.026864624366756653
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331161,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331161
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.028614624752805413,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.028614624752805413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.49517684887459806,
"acc_stderr": 0.02839677044411129,
"acc_norm": 0.49517684887459806,
"acc_norm_stderr": 0.02839677044411129
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5030864197530864,
"acc_stderr": 0.02782021415859437,
"acc_norm": 0.5030864197530864,
"acc_norm_stderr": 0.02782021415859437
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.02860208586275942,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.02860208586275942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3474576271186441,
"acc_stderr": 0.012161417729749798,
"acc_norm": 0.3474576271186441,
"acc_norm_stderr": 0.012161417729749798
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877753,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877753
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42320261437908496,
"acc_stderr": 0.01998780976948207,
"acc_norm": 0.42320261437908496,
"acc_norm_stderr": 0.01998780976948207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.03198761546763126,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.03198761546763126
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5472636815920398,
"acc_stderr": 0.035197027175769155,
"acc_norm": 0.5472636815920398,
"acc_norm_stderr": 0.035197027175769155
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041836,
"mc2": 0.3706359177223847,
"mc2_stderr": 0.01391522805511699
},
"harness|winogrande|5": {
"acc": 0.7016574585635359,
"acc_stderr": 0.012858885010030421
},
"harness|gsm8k|5": {
"acc": 0.10993176648976498,
"acc_stderr": 0.008616195587865418
}
}
```
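A small sketch of working with these numbers, assuming the run's results file (linked above) has been downloaded locally; the metrics may sit at the top level as shown in the snippet, or under a "results" key depending on the file layout:
```python
import json

# Load the run's results file; fall back to the top level if there is no
# nested "results" key.
with open("results_2023-12-09T17-48-05.024924.json") as f:
    blob = json.load(f)
results = blob.get("results", blob)

# Average the per-subject MMLU ("hendrycksTest") accuracies for this run.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"MMLU average over {len(mmlu)} subjects: {sum(mmlu)/len(mmlu):.4f}")
```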
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_itsliupeng__openllama-7b-icl | [
"region:us"
] | 2023-12-09T17:50:15+00:00 | {"pretty_name": "Evaluation run of itsliupeng/openllama-7b-icl", "dataset_summary": "Dataset automatically created during the evaluation run of model [itsliupeng/openllama-7b-icl](https://huggingface.co/itsliupeng/openllama-7b-icl) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__openllama-7b-icl\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T17:48:05.024924](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__openllama-7b-icl/blob/main/results_2023-12-09T17-48-05.024924.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44441569324312047,\n \"acc_stderr\": 0.03430171403503658,\n \"acc_norm\": 0.4497976132436587,\n \"acc_norm_stderr\": 0.035089311320789345,\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041836,\n \"mc2\": 0.3706359177223847,\n \"mc2_stderr\": 0.01391522805511699\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.44197952218430037,\n \"acc_stderr\": 0.014512682523128345,\n \"acc_norm\": 0.47952218430034127,\n \"acc_norm_stderr\": 0.014599131353035007\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5676160127464649,\n \"acc_stderr\": 0.0049439450696114546,\n \"acc_norm\": 0.7703644692292372,\n \"acc_norm_stderr\": 0.004197388626940065\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500482,\n \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500482\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n 
\"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n \"acc_stderr\": 0.038073017265045105,\n \"acc_norm\": 0.47398843930635837,\n \"acc_norm_stderr\": 0.038073017265045105\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709390974,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709390974\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.34893617021276596,\n \"acc_stderr\": 0.03115852213135778,\n \"acc_norm\": 0.34893617021276596,\n \"acc_norm_stderr\": 0.03115852213135778\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432563,\n \"acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432563\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45161290322580644,\n \"acc_stderr\": 0.028310500348568392,\n \"acc_norm\": 0.45161290322580644,\n \"acc_norm_stderr\": 0.028310500348568392\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.0390369864774844,\n \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.0390369864774844\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5202020202020202,\n \"acc_stderr\": 0.03559443565563918,\n \"acc_norm\": 0.5202020202020202,\n \"acc_norm_stderr\": 0.03559443565563918\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.616580310880829,\n \"acc_stderr\": 0.03508984236295342,\n \"acc_norm\": 0.616580310880829,\n \"acc_norm_stderr\": 0.03508984236295342\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.024838811988033158,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.024838811988033158\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02534809746809784,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02534809746809784\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6110091743119266,\n \"acc_stderr\": 0.020902300887392866,\n \"acc_norm\": 0.6110091743119266,\n \"acc_norm_stderr\": 0.020902300887392866\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605607,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605607\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03509312031717982,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03509312031717982\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6160337552742616,\n \"acc_stderr\": 0.031658678064106674,\n \"acc_norm\": 0.6160337552742616,\n \"acc_norm_stderr\": 0.031658678064106674\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n \"acc_stderr\": 0.03355746535223265,\n \"acc_norm\": 0.5022421524663677,\n \"acc_norm_stderr\": 0.03355746535223265\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5521472392638037,\n \"acc_stderr\": 0.03906947479456605,\n \"acc_norm\": 0.5521472392638037,\n \"acc_norm_stderr\": 0.03906947479456605\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6239316239316239,\n \"acc_stderr\": 0.03173393632969481,\n \"acc_norm\": 0.6239316239316239,\n \"acc_norm_stderr\": 0.03173393632969481\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5977011494252874,\n \"acc_stderr\": 
0.01753529452906895,\n \"acc_norm\": 0.5977011494252874,\n \"acc_norm_stderr\": 0.01753529452906895\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.026864624366756653,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.026864624366756653\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331161,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331161\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.48366013071895425,\n \"acc_stderr\": 0.028614624752805413,\n \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.028614624752805413\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.49517684887459806,\n \"acc_stderr\": 0.02839677044411129,\n \"acc_norm\": 0.49517684887459806,\n \"acc_norm_stderr\": 0.02839677044411129\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5030864197530864,\n \"acc_stderr\": 0.02782021415859437,\n \"acc_norm\": 0.5030864197530864,\n \"acc_norm_stderr\": 0.02782021415859437\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275942,\n \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275942\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3474576271186441,\n \"acc_stderr\": 0.012161417729749798,\n \"acc_norm\": 0.3474576271186441,\n \"acc_norm_stderr\": 0.012161417729749798\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877753,\n \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877753\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.42320261437908496,\n \"acc_stderr\": 0.01998780976948207,\n \"acc_norm\": 0.42320261437908496,\n \"acc_norm_stderr\": 0.01998780976948207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.03198761546763126,\n \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.03198761546763126\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5472636815920398,\n \"acc_stderr\": 0.035197027175769155,\n \"acc_norm\": 0.5472636815920398,\n \"acc_norm_stderr\": 0.035197027175769155\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.03660298834049163,\n \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.03660298834049163\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041836,\n \"mc2\": 0.3706359177223847,\n \"mc2_stderr\": 0.01391522805511699\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7016574585635359,\n \"acc_stderr\": 0.012858885010030421\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10993176648976498,\n \"acc_stderr\": 0.008616195587865418\n }\n}\n```", "repo_url": 
"https://huggingface.co/itsliupeng/openllama-7b-icl", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-48-05.024924.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-48-05.024924.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-48-05.024924.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-48-05.024924.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-48-05.024924.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-48-05.024924.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["**/details_harness|winogrande|5_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T17-48-05.024924.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T17_48_05.024924", "path": ["results_2023-12-09T17-48-05.024924.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T17-48-05.024924.parquet"]}]}]} | 2023-12-09T17:50:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of itsliupeng/openllama-7b-icl
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model itsliupeng/openllama-7b-icl on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
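For instance (a minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming convention):
```python
from datasets import load_dataset

# Assumed repo id, following the details_<org>__<model> naming convention.
data = load_dataset("open-llm-leaderboard/details_itsliupeng__openllama-7b-icl",
	"harness_winogrande_5",
	split="train")
```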
## Latest results
These are the latest results from run 2023-12-09T17:48:05.024924 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of itsliupeng/openllama-7b-icl",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model itsliupeng/openllama-7b-icl... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of itsliupeng/openllama-7b-icl",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model it... | [
6,
21,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of itsliupeng/openllama-7b-icl## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model itsliupeng/... |
021e8182fecdc40586bdae190b6bf26e075afd17 |
# Dataset Card for Evaluation run of rwitz/go-bruins
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/rwitz/go-bruins
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [rwitz/go-bruins](https://huggingface.co/rwitz/go-bruins) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rwitz__go-bruins",
"harness_winogrande_5",
split="train")
```
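
The aggregated metrics live in the dedicated "results" configuration; a minimal sketch of loading them, using the "latest" split listed in this card's metadata:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of each run;
# the "latest" split always points at the most recent one.
results = load_dataset("open-llm-leaderboard/details_rwitz__go-bruins",
	"results",
	split="latest")
print(results[0])
```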
## Latest results
These are the [latest results from run 2023-12-09T17:56:51.445836](https://huggingface.co/datasets/open-llm-leaderboard/details_rwitz__go-bruins/blob/main/results_2023-12-09T17-56-51.445836.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6537762475221197,
"acc_stderr": 0.03208085743053689,
"acc_norm": 0.6538246694322897,
"acc_norm_stderr": 0.032742779319017035,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.017341202394988257,
"mc2": 0.5871006945090181,
"mc2_stderr": 0.015474717474561337
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205761,
"acc_norm": 0.6911262798634812,
"acc_norm_stderr": 0.013501770929344003
},
"harness|hellaswag|10": {
"acc": 0.6857199761003784,
"acc_stderr": 0.004632797375289765,
"acc_norm": 0.867257518422625,
"acc_norm_stderr": 0.0033860277997584177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579665,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579665
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.016558601636041035,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.016558601636041035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045704,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045704
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.017341202394988257,
"mc2": 0.5871006945090181,
"mc2_stderr": 0.015474717474561337
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.6990144048521607,
"acc_stderr": 0.012634504465211185
}
}
```
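
To inspect individual task scores programmatically, the JSON above can be parsed directly; a minimal sketch, assuming a local copy of the linked results file (the filename below is the one from this run):
```python
import json

# Local copy of the results file linked above (hypothetical download path).
with open("results_2023-12-09T17-56-51.445836.json") as f:
    results = json.load(f)

# Print normalized accuracy for every task that reports one.
for task, metrics in results.items():
    if "acc_norm" in metrics:
        print(f"{task}: {metrics['acc_norm']:.4f}")
```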
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_rwitz__go-bruins | [
"region:us"
] | 2023-12-09T17:50:17+00:00 | {"pretty_name": "Evaluation run of rwitz/go-bruins", "dataset_summary": "Dataset automatically created during the evaluation run of model [rwitz/go-bruins](https://huggingface.co/rwitz/go-bruins) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rwitz__go-bruins\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T17:56:51.445836](https://huggingface.co/datasets/open-llm-leaderboard/details_rwitz__go-bruins/blob/main/results_2023-12-09T17-56-51.445836.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6537762475221197,\n \"acc_stderr\": 0.03208085743053689,\n \"acc_norm\": 0.6538246694322897,\n \"acc_norm_stderr\": 0.032742779319017035,\n \"mc1\": 0.4320685434516524,\n \"mc1_stderr\": 0.017341202394988257,\n \"mc2\": 0.5871006945090181,\n \"mc2_stderr\": 0.015474717474561337\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205761,\n \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344003\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6857199761003784,\n \"acc_stderr\": 0.004632797375289765,\n \"acc_norm\": 0.867257518422625,\n \"acc_norm_stderr\": 0.0033860277997584177\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 
0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n 
\"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579665,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579665\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 
0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n \"mc1_stderr\": 0.017341202394988257,\n \"mc2\": 0.5871006945090181,\n \"mc2_stderr\": 0.015474717474561337\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6990144048521607,\n \"acc_stderr\": 0.012634504465211185\n }\n}\n```", "repo_url": "https://huggingface.co/rwitz/go-bruins", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-47-26.960183.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-47-26.960183.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-56-51.445836.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-56-51.445836.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-56-51.445836.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-56-51.445836.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-47-26.960183.parquet"]}, 
{"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["**/details_harness|winogrande|5_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": ["**/details_harness|winogrande|5_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T17-56-51.445836.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T17_47_26.960183", "path": ["results_2023-12-09T17-47-26.960183.parquet"]}, {"split": "2023_12_09T17_56_51.445836", "path": 
["results_2023-12-09T17-56-51.445836.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T17-56-51.445836.parquet"]}]}]} | 2023-12-09T18:00:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of rwitz/go-bruins
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model rwitz/go-bruins on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
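A minimal sketch following the loading pattern used by the sibling evaluation cards in this dump; the repository id `open-llm-leaderboard/details_rwitz__go-bruins` is inferred from the `details_<org>__<model>` naming convention of the other details datasets, so treat it as an assumption:

```python
from datasets import load_dataset

# The dataset id below is inferred from the "details_<org>__<model>" naming
# convention used by the other Open LLM Leaderboard details repositories.
data = load_dataset("open-llm-leaderboard/details_rwitz__go-bruins",
	"harness_winogrande_5",
	split="train")
```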
## Latest results
These are the latest results from run 2023-12-09T17:56:51.445836 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of rwitz/go-bruins",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model rwitz/go-bruins on the Open LLM Leaderb... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of rwitz/go-bruins",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model rwitz/go-bruin... | [
6,
16,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of rwitz/go-bruins## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model rwitz/go-bruins on the ... |
f402965bf15d803974053361199318a9c25286bc |
# Dataset Card for Evaluation run of PulsarAI/MetaMath-Tulpar-7b-v2-Slerp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/MetaMath-Tulpar-7b-v2-Slerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/MetaMath-Tulpar-7b-v2-Slerp](https://huggingface.co/PulsarAI/MetaMath-Tulpar-7b-v2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Each evaluated task has its own configuration; the "train" split always
# points to the results of the latest run.
data = load_dataset("open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp",
	"harness_winogrande_5",
	split="train")
```
## Latest results
These are the [latest results from run 2023-12-09T17:55:14.434225](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp/blob/main/results_2023-12-09T17-55-14.434225.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.639251601749628,
"acc_stderr": 0.03221647012444142,
"acc_norm": 0.6389576323016398,
"acc_norm_stderr": 0.03288102806405326,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.564970662967412,
"mc2_stderr": 0.015518503176886996
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042194,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.6677952599083847,
"acc_stderr": 0.004700413824942566,
"acc_norm": 0.8516231826329417,
"acc_norm_stderr": 0.0035474663103253973
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781874,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598559,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598559
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.02895975519682487,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.02895975519682487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487043,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.564970662967412,
"mc2_stderr": 0.015518503176886996
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.011350315707462063
},
"harness|gsm8k|5": {
"acc": 0.709628506444276,
"acc_stderr": 0.012503592481818948
}
}
```
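To consume these aggregated metrics programmatically instead of copying the JSON above, one can load the "results" configuration declared in this dataset's metadata; a minimal sketch, assuming the "latest" split defined there:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# points to the most recent evaluation (2023-12-09T17:55:14 here).
results = load_dataset("open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp",
	"results",
	split="latest")
```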
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp | [
"region:us"
] | 2023-12-09T17:58:06+00:00 | {"pretty_name": "Evaluation run of PulsarAI/MetaMath-Tulpar-7b-v2-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [PulsarAI/MetaMath-Tulpar-7b-v2-Slerp](https://huggingface.co/PulsarAI/MetaMath-Tulpar-7b-v2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T17:55:14.434225](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp/blob/main/results_2023-12-09T17-55-14.434225.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.639251601749628,\n \"acc_stderr\": 0.03221647012444142,\n \"acc_norm\": 0.6389576323016398,\n \"acc_norm_stderr\": 0.03288102806405326,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.564970662967412,\n \"mc2_stderr\": 0.015518503176886996\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042194,\n \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156213\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6677952599083847,\n \"acc_stderr\": 0.004700413824942566,\n \"acc_norm\": 0.8516231826329417,\n \"acc_norm_stderr\": 0.0035474663103253973\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 
0.02338193534812143\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781874,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781874\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.012729785386598559,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.012729785386598559\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.02895975519682487,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.02895975519682487\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487043,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487043\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.564970662967412,\n \"mc2_stderr\": 0.015518503176886996\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462063\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.709628506444276,\n \"acc_stderr\": 
0.012503592481818948\n }\n}\n```", "repo_url": "https://huggingface.co/PulsarAI/MetaMath-Tulpar-7b-v2-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-55-14.434225.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-55-14.434225.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-55-14.434225.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-55-14.434225.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-55-14.434225.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T17_55_14.434225", "path": ["**/details_harness|winogrande|5_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T17-55-14.434225.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T17_55_14.434225", "path": ["results_2023-12-09T17-55-14.434225.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T17-55-14.434225.parquet"]}]}]} | 2023-12-09T17:58:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of PulsarAI/MetaMath-Tulpar-7b-v2-Slerp
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model PulsarAI/MetaMath-Tulpar-7b-v2-Slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
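A minimal sketch of that load call (the repo id below follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by these detail datasets, and `harness_winogrande_5` is one of the configs listed in this repo's metadata):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task (5-shot Winogrande here).
data = load_dataset("open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp",
	"harness_winogrande_5",
	split="train")
```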
## Latest results
These are the latest results from run 2023-12-09T17:55:14.434225 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of PulsarAI/MetaMath-Tulpar-7b-v2-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model PulsarAI/MetaMath-... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of PulsarAI/MetaMath-Tulpar-7b-v2-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of... | [
6,
28,
31,
177,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PulsarAI/MetaMath-Tulpar-7b-v2-Slerp## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Pu... |
86b9f45073ae0a9f432dff2f2649edea66b8086b |
# Dataset Card for Evaluation run of PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp](https://huggingface.co/PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__MetaMath-Chupacabra-7B-v2.01-Slerp",
"harness_winogrande_5",
split="train")
```
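The aggregated metrics are also available directly: per the config list in this repo's metadata, a `results` configuration exists with a `latest` split pointing at the most recent run. A small sketch, assuming that layout:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__MetaMath-Chupacabra-7B-v2.01-Slerp",
    "results",
    split="latest",
)
```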
## Latest results
These are the [latest results from run 2023-12-09T17:58:17.272756](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__MetaMath-Chupacabra-7B-v2.01-Slerp/blob/main/results_2023-12-09T17-58-17.272756.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6430394737674227,
"acc_stderr": 0.03225098588955544,
"acc_norm": 0.643238473261251,
"acc_norm_stderr": 0.03291299264153459,
"mc1": 0.39412484700122397,
"mc1_stderr": 0.017106588140700322,
"mc2": 0.5614591813728808,
"mc2_stderr": 0.015408154626799953
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131169,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6669986058554073,
"acc_stderr": 0.004703238534045804,
"acc_norm": 0.8546106353316073,
"acc_norm_stderr": 0.0035177257870177433
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919446,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919446
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579828,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579828
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502346,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169143,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169143
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39412484700122397,
"mc1_stderr": 0.017106588140700322,
"mc2": 0.5614591813728808,
"mc2_stderr": 0.015408154626799953
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.01135031570746206
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693625
}
}
```
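As a minimal sketch of working with these numbers (assuming the JSON block above has been saved locally as `results.json`, a hypothetical filename), the `hendrycksTest` (MMLU) subtask accuracies can be averaged like this:

```python
import json

# Average the 5-shot accuracies over the hendrycksTest (MMLU) subtasks.
with open("results.json") as f:
    results = json.load(f)

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
print(f"MMLU average acc over {len(mmlu_accs)} subtasks: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```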
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_PulsarAI__MetaMath-Chupacabra-7B-v2.01-Slerp | [
"region:us"
] | 2023-12-09T18:01:07+00:00 | {"pretty_name": "Evaluation run of PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp](https://huggingface.co/PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__MetaMath-Chupacabra-7B-v2.01-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T17:58:17.272756](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__MetaMath-Chupacabra-7B-v2.01-Slerp/blob/main/results_2023-12-09T17-58-17.272756.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6430394737674227,\n \"acc_stderr\": 0.03225098588955544,\n \"acc_norm\": 0.643238473261251,\n \"acc_norm_stderr\": 0.03291299264153459,\n \"mc1\": 0.39412484700122397,\n \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.5614591813728808,\n \"mc2_stderr\": 0.015408154626799953\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131169,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6669986058554073,\n \"acc_stderr\": 0.004703238534045804,\n \"acc_norm\": 0.8546106353316073,\n \"acc_norm_stderr\": 0.0035177257870177433\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919446,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 
0.022935144053919446\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8199233716475096,\n \"acc_stderr\": 0.013740797258579828,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579828\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757485,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757485\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n \"acc_stderr\": 0.012707390438502346,\n \"acc_norm\": 0.45045632333767927,\n \"acc_norm_stderr\": 0.012707390438502346\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169143,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169143\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39412484700122397,\n \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.5614591813728808,\n \"mc2_stderr\": 0.015408154626799953\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.01135031570746206\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \"acc_stderr\": 
0.012607137125693625\n }\n}\n```", "repo_url": "https://huggingface.co/PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-58-17.272756.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-58-17.272756.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-58-17.272756.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T17-58-17.272756.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-58-17.272756.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T17_58_17.272756", "path": ["**/details_harness|winogrande|5_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T17-58-17.272756.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T17_58_17.272756", "path": ["results_2023-12-09T17-58-17.272756.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T17-58-17.272756.parquet"]}]}]} | 2023-12-09T18:01:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
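A minimal sketch, assuming the details repository follows the standard open-llm-leaderboard naming scheme for this model (the configuration names appear in this card's metadata; the exact repo id is an inference, not confirmed above):

```python
from datasets import load_dataset

# Dataset repo id inferred from the model name "PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp".
data = load_dataset("open-llm-leaderboard/details_PulsarAI__MetaMath-Chupacabra-7B-v2.01-Slerp",
	"harness_winogrande_5",
	split="train")
```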
## Latest results
These are the latest results from run 2023-12-09T17:58:17.272756 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model PulsarAI/Me... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation... | [
6,
30,
31,
179,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PulsarAI/MetaMath-Chupacabra-7B-v2.01-Slerp## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of m... |
c9c3d046e30c42df35edb5b67a2fae455501486f |
# Dataset Card for Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp](https://huggingface.co/PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-2-Slerp",
"harness_winogrande_5",
split="train")
```
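
The aggregated results can be loaded the same way from the "results" configuration (a sketch; per the configuration listing in this card's metadata, each configuration also exposes a "latest" split that mirrors the most recent timestamped run):

```python
from datasets import load_dataset

# "latest" always points to the most recent timestamped run of this eval.
results = load_dataset("open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-2-Slerp",
	"results",
	split="latest")
```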
## Latest results
These are the [latest results from run 2023-12-09T18:04:51.228408](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-2-Slerp/blob/main/results_2023-12-09T18-04-51.228408.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.644055937606071,
"acc_stderr": 0.032184807364406556,
"acc_norm": 0.6454677507073991,
"acc_norm_stderr": 0.03283460519387843,
"mc1": 0.4504283965728274,
"mc1_stderr": 0.017417264371967646,
"mc2": 0.6104827225746667,
"mc2_stderr": 0.014972794318436832
},
"harness|arc:challenge|25": {
"acc": 0.6459044368600683,
"acc_stderr": 0.013975454122756557,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.6569408484365664,
"acc_stderr": 0.0047376083401634,
"acc_norm": 0.8542123083051185,
"acc_norm_stderr": 0.0035217202839105555
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973138,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.01638463841038082,
"acc_norm": 0.4,
"acc_norm_stderr": 0.01638463841038082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598568,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598568
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4504283965728274,
"mc1_stderr": 0.017417264371967646,
"mc2": 0.6104827225746667,
"mc2_stderr": 0.014972794318436832
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625849
},
"harness|gsm8k|5": {
"acc": 0.6307808946171342,
"acc_stderr": 0.013293019538066244
}
}
```
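
If you prefer the raw JSON over the parquet splits, the results file linked above can also be fetched directly with `huggingface_hub` (a sketch using only the standard `hf_hub_download` API; the filename matches the run timestamp above):

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-2-Slerp",
    filename="results_2023-12-09T18-04-51.228408.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Inspect the top-level keys; per the excerpt above, the metrics live under "all".
print(list(results))
```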
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-2-Slerp | [
"region:us"
] | 2023-12-09T18:07:42+00:00 | {"pretty_name": "Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp](https://huggingface.co/PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-2-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T18:04:51.228408](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-2-Slerp/blob/main/results_2023-12-09T18-04-51.228408.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.644055937606071,\n \"acc_stderr\": 0.032184807364406556,\n \"acc_norm\": 0.6454677507073991,\n \"acc_norm_stderr\": 0.03283460519387843,\n \"mc1\": 0.4504283965728274,\n \"mc1_stderr\": 0.017417264371967646,\n \"mc2\": 0.6104827225746667,\n \"mc2_stderr\": 0.014972794318436832\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.013975454122756557,\n \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729124\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6569408484365664,\n \"acc_stderr\": 0.0047376083401634,\n \"acc_norm\": 0.8542123083051185,\n \"acc_norm_stderr\": 0.0035217202839105555\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n 
\"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973138,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973138\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.012729785386598568,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.012729785386598568\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4504283965728274,\n \"mc1_stderr\": 0.017417264371967646,\n \"mc2\": 0.6104827225746667,\n \"mc2_stderr\": 0.014972794318436832\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625849\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6307808946171342,\n \"acc_stderr\": 
0.013293019538066244\n }\n}\n```", "repo_url": "https://huggingface.co/PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|arc:challenge|25_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|gsm8k|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hellaswag|10_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-04-51.228408.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-04-51.228408.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-04-51.228408.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T18-04-51.228408.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-04-51.228408.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T18_04_51.228408", "path": ["**/details_harness|winogrande|5_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T18-04-51.228408.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T18_04_51.228408", "path": ["results_2023-12-09T18-04-51.228408.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T18-04-51.228408.parquet"]}]}]} | 2023-12-09T18:08:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
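```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-2-Slerp",
	"harness_winogrande_5",
	split="train")
```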
## Latest results
These are the latest results from run 2023-12-09T18:04:51.228408 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model PulsarAI... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluat... | [
6,
30,
31,
179,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-2-Slerp## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run o... |
e04d24c12154638ff6ce00cccb222946b48b303b | # Dataset Card for "rapidapi-example-responses"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | davidfant/rapidapi-example-responses | [
"region:us"
] | 2023-12-09T18:25:26+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "api_name", "dtype": "string"}, {"name": "api_description", "dtype": "string"}, {"name": "api_score", "dtype": "float64"}, {"name": "endpoint_name", "dtype": "string"}, {"name": "endpoint_description", "dtype": "string"}, {"name": "response_status_code", "dtype": "int64"}, {"name": "response_summary", "dtype": "string"}, {"name": "response_json", "dtype": "string"}, {"name": "response_json_schema", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115936364, "num_examples": 28059}], "download_size": 27933521, "dataset_size": 115936364}} | 2023-12-10T11:16:52+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "rapidapi-example-responses"
More Information needed | [
"# Dataset Card for \"rapidapi-example-responses\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"rapidapi-example-responses\"\n\nMore Information needed"
] | [
6,
20
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"rapidapi-example-responses\"\n\nMore Information needed"
] |
039426294a1c166522c7ca349782104640515b8d |
# Dataset Card for Evaluation run of abacusai/Giraffe-13b-32k-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/abacusai/Giraffe-13b-32k-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [abacusai/Giraffe-13b-32k-v3](https://huggingface.co/abacusai/Giraffe-13b-32k-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__Giraffe-13b-32k-v3",
"harness_winogrande_5",
split="train")
```
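The per-task configurations above return the detailed predictions; the aggregated metrics live in the "results" configuration mentioned earlier. A minimal sketch of loading them, assuming the "latest" split naming used by the other configurations:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split
# points to the most recent evaluation run for this model.
results = load_dataset("open-llm-leaderboard/details_abacusai__Giraffe-13b-32k-v3",
	"results",
	split="latest")
```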
## Latest results
These are the [latest results from run 2023-12-09T18:24:23.140202](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Giraffe-13b-32k-v3/blob/main/results_2023-12-09T18-24-23.140202.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5497505679930987,
"acc_stderr": 0.033867836191543606,
"acc_norm": 0.554916151896208,
"acc_norm_stderr": 0.03459261280501611,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.4667915335080478,
"mc2_stderr": 0.014974105305176868
},
"harness|arc:challenge|25": {
"acc": 0.5520477815699659,
"acc_stderr": 0.014532011498211676,
"acc_norm": 0.590443686006826,
"acc_norm_stderr": 0.014370358632472444
},
"harness|hellaswag|10": {
"acc": 0.5978888667596096,
"acc_stderr": 0.0048932206350117925,
"acc_norm": 0.795857398924517,
"acc_norm_stderr": 0.004022499210760734
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537316,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537316
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028428,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028428
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147124,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147124
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.02766618207553965,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.02766618207553965
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.030031147977641538,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.030031147977641538
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412202,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412202
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4957983193277311,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.4957983193277311,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7192660550458716,
"acc_stderr": 0.019266055045871616,
"acc_norm": 0.7192660550458716,
"acc_norm_stderr": 0.019266055045871616
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.033723432716530645,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.033723432716530645
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.0284588209914603,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.0284588209914603
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008732,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008732
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514511,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514511
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.02537213967172293,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.02537213967172293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7292464878671775,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.7292464878671775,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098174,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.01638463841038082,
"acc_norm": 0.4,
"acc_norm_stderr": 0.01638463841038082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.02741799670563099,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.02741799670563099
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236855,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39504563233376794,
"acc_stderr": 0.012485727813251562,
"acc_norm": 0.39504563233376794,
"acc_norm_stderr": 0.012485727813251562
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.03036544647727568,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.03036544647727568
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.020102583895887188,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.020102583895887188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.4667915335080478,
"mc2_stderr": 0.014974105305176868
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836676
},
"harness|gsm8k|5": {
"acc": 0.26156178923426837,
"acc_stderr": 0.012105605733382444
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_abacusai__Giraffe-13b-32k-v3 | [
"region:us"
] | 2023-12-09T18:27:19+00:00 | {"pretty_name": "Evaluation run of abacusai/Giraffe-13b-32k-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [abacusai/Giraffe-13b-32k-v3](https://huggingface.co/abacusai/Giraffe-13b-32k-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__Giraffe-13b-32k-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T18:24:23.140202](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Giraffe-13b-32k-v3/blob/main/results_2023-12-09T18-24-23.140202.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5497505679930987,\n \"acc_stderr\": 0.033867836191543606,\n \"acc_norm\": 0.554916151896208,\n \"acc_norm_stderr\": 0.03459261280501611,\n \"mc1\": 0.32558139534883723,\n \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4667915335080478,\n \"mc2_stderr\": 0.014974105305176868\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5520477815699659,\n \"acc_stderr\": 0.014532011498211676,\n \"acc_norm\": 0.590443686006826,\n \"acc_norm_stderr\": 0.014370358632472444\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5978888667596096,\n \"acc_stderr\": 0.0048932206350117925,\n \"acc_norm\": 0.795857398924517,\n \"acc_norm_stderr\": 0.004022499210760734\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 
0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537316,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537316\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028428,\n \"acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028428\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.04073524322147124,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.04073524322147124\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n \"acc_stderr\": 0.02766618207553965,\n \"acc_norm\": 0.6161290322580645,\n \"acc_norm_stderr\": 0.02766618207553965\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412202,\n \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412202\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.03247734334448111,\n \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.03247734334448111\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7192660550458716,\n \"acc_stderr\": 0.019266055045871616,\n \"acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.019266055045871616\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530645,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530645\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.0284588209914603,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.0284588209914603\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.03244305283008732,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.03244305283008732\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514511,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514511\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7292464878671775,\n \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.7292464878671775,\n \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098174,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236855,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236855\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39504563233376794,\n \"acc_stderr\": 0.012485727813251562,\n \"acc_norm\": 0.39504563233376794,\n \"acc_norm_stderr\": 0.012485727813251562\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.03036544647727568,\n \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.03036544647727568\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887188,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887188\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.7412935323383084,\n \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686399,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686399\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4667915335080478,\n \"mc2_stderr\": 0.014974105305176868\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836676\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.26156178923426837,\n \"acc_stderr\": 0.012105605733382444\n }\n}\n```", "repo_url": 
"https://huggingface.co/abacusai/Giraffe-13b-32k-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|arc:challenge|25_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|gsm8k|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hellaswag|10_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-24-23.140202.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-24-23.140202.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-24-23.140202.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T18-24-23.140202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-24-23.140202.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-24-23.140202.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["**/details_harness|winogrande|5_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T18-24-23.140202.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T18_24_23.140202", "path": ["results_2023-12-09T18-24-23.140202.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T18-24-23.140202.parquet"]}]}]} | 2023-12-09T18:28:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abacusai/Giraffe-13b-32k-v3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model abacusai/Giraffe-13b-32k-v3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
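```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__Giraffe-13b-32k-v3",
    "harness_winogrande_5",
    split="train")
```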
## Latest results
These are the latest results from run 2023-12-09T18:24:23.140202 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of abacusai/Giraffe-13b-32k-v3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model abacusai/Giraffe-13b-32k-v3... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abacusai/Giraffe-13b-32k-v3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ab... | [
6,
23,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of abacusai/Giraffe-13b-32k-v3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model abacusai/Gi... |
7a924e8fa53d351c4d0759cc56d9d218ff9cfc9e |
# Dataset Card for Evaluation run of amazingvince/where-llambo-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/amazingvince/where-llambo-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [amazingvince/where-llambo-7b](https://huggingface.co/amazingvince/where-llambo-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_amazingvince__where-llambo-7b",
"harness_winogrande_5",
split="train")
```
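The same call pattern retrieves the aggregated metrics. A minimal sketch, assuming the "results" configuration mentioned above exposes the same split layout (a timestamped split plus "latest") as the per-task configs:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run;
# "latest" points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_amazingvince__where-llambo-7b",
                       "results",
                       split="latest")
```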
## Latest results
These are the [latest results from run 2023-12-09T18:44:39.604520](https://huggingface.co/datasets/open-llm-leaderboard/details_amazingvince__where-llambo-7b/blob/main/results_2023-12-09T18-44-39.604520.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6276007814719067,
"acc_stderr": 0.03245983620498288,
"acc_norm": 0.6287066769044074,
"acc_norm_stderr": 0.03312214889081226,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.4961220088630948,
"mc2_stderr": 0.014820546287012869
},
"harness|arc:challenge|25": {
"acc": 0.5452218430034129,
"acc_stderr": 0.014551507060836357,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.014401366641216386
},
"harness|hellaswag|10": {
"acc": 0.612427803226449,
"acc_stderr": 0.004862003566798543,
"acc_norm": 0.8205536745668194,
"acc_norm_stderr": 0.00382941380511398
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229862,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229862
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.01591955782997604,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.01591955782997604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876173,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876173
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010076,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.02957326913441112,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.02957326913441112
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.019312676065786565,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.019312676065786565
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.4961220088630948,
"mc2_stderr": 0.014820546287012869
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345402
},
"harness|gsm8k|5": {
"acc": 0.6520090978013646,
"acc_stderr": 0.013120581030382134
}
}
```
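To make these aggregated numbers easy to inspect programmatically, here is a minimal sketch, assuming the aggregated `results` configuration and the `latest` split convention described in this card (the exact row schema of the aggregated file may differ slightly):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_amazingvince__where-llambo-7b",
    "results",
    split="latest",
)

# Each row carries the aggregated metrics shown in the JSON above.
print(results[0])
```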
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
"region:us"
] | 2023-12-09T18:47:31+00:00 | {"pretty_name": "Evaluation run of amazingvince/where-llambo-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [amazingvince/where-llambo-7b](https://huggingface.co/amazingvince/where-llambo-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_amazingvince__where-llambo-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T18:44:39.604520](https://huggingface.co/datasets/open-llm-leaderboard/details_amazingvince__where-llambo-7b/blob/main/results_2023-12-09T18-44-39.604520.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6276007814719067,\n \"acc_stderr\": 0.03245983620498288,\n \"acc_norm\": 0.6287066769044074,\n \"acc_norm_stderr\": 0.03312214889081226,\n \"mc1\": 0.34394124847001223,\n \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.4961220088630948,\n \"mc2_stderr\": 0.014820546287012869\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5452218430034129,\n \"acc_stderr\": 0.014551507060836357,\n \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216386\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.612427803226449,\n \"acc_stderr\": 0.004862003566798543,\n \"acc_norm\": 0.8205536745668194,\n \"acc_norm_stderr\": 0.00382941380511398\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 
0.013890862162876173,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.013890862162876173\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n \"acc_stderr\": 0.014854993938010076,\n \"acc_norm\": 0.27039106145251396,\n \"acc_norm_stderr\": 0.014854993938010076\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.02957326913441112,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.02957326913441112\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786565,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786565\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.4961220088630948,\n \"mc2_stderr\": 0.014820546287012869\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345402\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6520090978013646,\n \"acc_stderr\": 0.013120581030382134\n }\n}\n```", "repo_url": 
"https://huggingface.co/amazingvince/where-llambo-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|arc:challenge|25_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|gsm8k|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hellaswag|10_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-44-39.604520.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-44-39.604520.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-44-39.604520.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T18-44-39.604520.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-44-39.604520.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-44-39.604520.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["**/details_harness|winogrande|5_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T18-44-39.604520.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T18_44_39.604520", "path": ["results_2023-12-09T18-44-39.604520.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T18-44-39.604520.parquet"]}]}]} | 2023-12-09T18:48:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of amazingvince/where-llambo-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model amazingvince/where-llambo-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-09T18:44:39.604520(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of amazingvince/where-llambo-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model amazingvince/where-llambo-... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of amazingvince/where-llambo-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model a... | [
6,
19,
31,
168,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of amazingvince/where-llambo-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model amazingvin... |
114b7df46a1df5842195d686215163f5a806111a |
# Dataset Card for Evaluation run of mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1](https://huggingface.co/mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1",
"harness_winogrande_5",
split="train")
```
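The aggregated metrics can be read the same way through the "results" configuration, whose "latest" split aliases the newest run (a small sketch mirroring the call above):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics across all evaluated tasks.
results = load_dataset("open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1",
    "results",
    split="latest")
```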
## Latest results
These are the [latest results from run 2023-12-09T18:49:52.400292](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1/blob/main/results_2023-12-09T18-49-52.400292.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2882361931913223,
"acc_stderr": 0.031895486998552665,
"acc_norm": 0.2903928342274915,
"acc_norm_stderr": 0.03267939625046512,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006535,
"mc2": 0.41227748774876055,
"mc2_stderr": 0.014572961912704371
},
"harness|arc:challenge|25": {
"acc": 0.3660409556313993,
"acc_stderr": 0.01407722310847014,
"acc_norm": 0.40187713310580203,
"acc_norm_stderr": 0.014327268614578274
},
"harness|hellaswag|10": {
"acc": 0.5142401911969727,
"acc_stderr": 0.00498775731476984,
"acc_norm": 0.7007568213503286,
"acc_norm_stderr": 0.00456990648509029
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714534,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.030251237579213174,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.030251237579213174
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.038783523721386215,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.038783523721386215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.020940481565334835,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.020940481565334835
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624335,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624335
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29292929292929293,
"acc_stderr": 0.03242497958178815,
"acc_norm": 0.29292929292929293,
"acc_norm_stderr": 0.03242497958178815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.031618779179354115,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.031618779179354115
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02144454730156047,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02144454730156047
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279496,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279496
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.034791855725996586,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.034791855725996586
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28990825688073396,
"acc_stderr": 0.019453066609201597,
"acc_norm": 0.28990825688073396,
"acc_norm_stderr": 0.019453066609201597
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.030388051301678116,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.030388051301678116
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.31645569620253167,
"acc_stderr": 0.03027497488021898,
"acc_norm": 0.31645569620253167,
"acc_norm_stderr": 0.03027497488021898
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.32515337423312884,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.32515337423312884,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3162393162393162,
"acc_stderr": 0.030463656747340268,
"acc_norm": 0.3162393162393162,
"acc_norm_stderr": 0.030463656747340268
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.37292464878671777,
"acc_stderr": 0.017292868269453924,
"acc_norm": 0.37292464878671777,
"acc_norm_stderr": 0.017292868269453924
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210742,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.02600330111788513,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.02600330111788513
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3271604938271605,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.3271604938271605,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503793,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503793
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28096479791395046,
"acc_stderr": 0.011479684550077692,
"acc_norm": 0.28096479791395046,
"acc_norm_stderr": 0.011479684550077692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.018311653053648222,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.018311653053648222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.026537045312145287,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.026537045312145287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.32338308457711445,
"acc_stderr": 0.033076159479790326,
"acc_norm": 0.32338308457711445,
"acc_norm_stderr": 0.033076159479790326
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.035915667978246635,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.035915667978246635
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006535,
"mc2": 0.41227748774876055,
"mc2_stderr": 0.014572961912704371
},
"harness|winogrande|5": {
"acc": 0.6503551696921863,
"acc_stderr": 0.013402073680850515
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
}
}
```
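If you want the raw per-run JSON shown above rather than the parquet configurations, it can be downloaded straight from the dataset repo; a minimal sketch with `huggingface_hub` (the filename comes from the link above, while the file's exact top-level layout is an assumption, so inspect it before indexing):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the per-run results file referenced in the "Latest results" link.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1",
    filename="results_2023-12-09T18-49-52.400292.json",
    repo_type="dataset",
)
with open(path) as f:
    run_results = json.load(f)
print(sorted(run_results.keys()))  # inspect the layout before indexing into it
```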
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1 | [
"region:us"
] | 2023-12-09T18:52:51+00:00 | {"pretty_name": "Evaluation run of mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1](https://huggingface.co/mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T18:49:52.400292](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1/blob/main/results_2023-12-09T18-49-52.400292.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2882361931913223,\n \"acc_stderr\": 0.031895486998552665,\n \"acc_norm\": 0.2903928342274915,\n \"acc_norm_stderr\": 0.03267939625046512,\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.015594753632006535,\n \"mc2\": 0.41227748774876055,\n \"mc2_stderr\": 0.014572961912704371\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3660409556313993,\n \"acc_stderr\": 0.01407722310847014,\n \"acc_norm\": 0.40187713310580203,\n \"acc_norm_stderr\": 0.014327268614578274\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5142401911969727,\n \"acc_stderr\": 0.00498775731476984,\n \"acc_norm\": 0.7007568213503286,\n \"acc_norm_stderr\": 0.00456990648509029\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n \"acc_norm_stderr\": 
0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714534,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.030251237579213174,\n \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.030251237579213174\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.038783523721386215,\n \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.038783523721386215\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.020940481565334835,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.020940481565334835\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.24516129032258063,\n \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624335,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624335\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.29292929292929293,\n \"acc_stderr\": 0.03242497958178815,\n \"acc_norm\": 0.29292929292929293,\n \"acc_norm_stderr\": 0.03242497958178815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.031618779179354115,\n 
\"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.031618779179354115\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02144454730156047,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02144454730156047\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279496,\n \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279496\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23841059602649006,\n \"acc_stderr\": 0.034791855725996586,\n \"acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.034791855725996586\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.28990825688073396,\n \"acc_stderr\": 0.019453066609201597,\n \"acc_norm\": 0.28990825688073396,\n \"acc_norm_stderr\": 0.019453066609201597\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.27314814814814814,\n \"acc_stderr\": 0.030388051301678116,\n \"acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.030388051301678116\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.31645569620253167,\n \"acc_stderr\": 0.03027497488021898,\n \"acc_norm\": 0.31645569620253167,\n \"acc_norm_stderr\": 0.03027497488021898\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.29596412556053814,\n \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.32515337423312884,\n \"acc_stderr\": 0.036803503712864616,\n \"acc_norm\": 0.32515337423312884,\n \"acc_norm_stderr\": 0.036803503712864616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3162393162393162,\n \"acc_stderr\": 0.030463656747340268,\n \"acc_norm\": 0.3162393162393162,\n \"acc_norm_stderr\": 0.030463656747340268\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n 
\"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.37292464878671777,\n \"acc_stderr\": 0.017292868269453924,\n \"acc_norm\": 0.37292464878671777,\n \"acc_norm_stderr\": 0.017292868269453924\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.32947976878612717,\n \"acc_stderr\": 0.025305258131879716,\n \"acc_norm\": 0.32947976878612717,\n \"acc_norm_stderr\": 0.025305258131879716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n \"acc_stderr\": 0.014593620923210742,\n \"acc_norm\": 0.2558659217877095,\n \"acc_norm_stderr\": 0.014593620923210742\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n \"acc_stderr\": 0.02600330111788513,\n \"acc_norm\": 0.2990353697749196,\n \"acc_norm_stderr\": 0.02600330111788513\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3271604938271605,\n \"acc_stderr\": 0.026105673861409825,\n \"acc_norm\": 0.3271604938271605,\n \"acc_norm_stderr\": 0.026105673861409825\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503793,\n \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503793\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28096479791395046,\n \"acc_stderr\": 0.011479684550077692,\n \"acc_norm\": 0.28096479791395046,\n \"acc_norm_stderr\": 0.011479684550077692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.018311653053648222,\n \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.018311653053648222\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.026537045312145287,\n \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.026537045312145287\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.32338308457711445,\n \"acc_stderr\": 0.033076159479790326,\n \"acc_norm\": 0.32338308457711445,\n \"acc_norm_stderr\": 0.033076159479790326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n \"acc_stderr\": 0.035915667978246635,\n \"acc_norm\": 0.3072289156626506,\n \"acc_norm_stderr\": 0.035915667978246635\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.015594753632006535,\n \"mc2\": 0.41227748774876055,\n \"mc2_stderr\": 0.014572961912704371\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6503551696921863,\n 
\"acc_stderr\": 0.013402073680850515\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \"acc_stderr\": 0.003970449129848635\n }\n}\n```", "repo_url": "https://huggingface.co/mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|arc:challenge|25_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|gsm8k|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hellaswag|10_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-49-52.400292.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-49-52.400292.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-49-52.400292.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T18-49-52.400292.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-49-52.400292.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["**/details_harness|winogrande|5_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-09T18-49-52.400292.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T18_49_52.400292", "path": ["results_2023-12-09T18-49-52.400292.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T18-49-52.400292.parquet"]}]}]} | 2023-12-09T18:53:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
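Following the pattern of the other evaluation cards in this collection, the snippet below is a minimal sketch; the repository id `open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1` is inferred from the model name and the leaderboard's naming convention, not confirmed by this card.
```python
from datasets import load_dataset

# Load one evaluation configuration; the "train" split points to the latest run
data = load_dataset("open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1",
	"harness_winogrande_5",
	split="train")
```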
## Latest results
These are the latest results from run 2023-12-09T18:49:52.400292 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model mwitide... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evalua... | [
6,
30,
31,
179,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run ... |
7f5c8e586c00ebaabb396ba68b269ef15d068188 | # Dataset Card for "cds_both_balanced_512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | lhallee/cds_both_balanced_512 | [
"region:us"
] | 2023-12-09T19:04:19+00:00 | {"dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "species", "dtype": "string"}, {"name": "CDS", "dtype": "string"}, {"name": "AA", "dtype": "string"}, {"name": "Label", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 1905721929, "num_examples": 3245094}], "download_size": 1707079967, "dataset_size": 1905721929}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-09T19:05:45+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cds_both_balanced_512"
More Information needed | [
"# Dataset Card for \"cds_both_balanced_512\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cds_both_balanced_512\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"cds_both_balanced_512\"\n\nMore Information needed"
] |
5f8d715eb506f24e3ed53db9ec8ed3c7fe52840e | # Dataset Card for "ccds_human_512.csv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | lhallee/ccds_human_512 | [
"region:us"
] | 2023-12-09T19:07:57+00:00 | {"dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "species", "dtype": "string"}, {"name": "CDS", "dtype": "string"}, {"name": "AA", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13364217, "num_examples": 20882}], "download_size": 12357139, "dataset_size": 13364217}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-09T19:07:58+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ccds_human_512.csv"
More Information needed | [
"# Dataset Card for \"ccds_human_512.csv\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ccds_human_512.csv\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ccds_human_512.csv\"\n\nMore Information needed"
] |
cb5a6e0799e65d81629735883a95363fe0cd168c | # Dataset Card for "ccds_mouse_512.csv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | lhallee/ccds_mouse_512 | [
"region:us"
] | 2023-12-09T19:08:05+00:00 | {"dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "species", "dtype": "string"}, {"name": "CDS", "dtype": "string"}, {"name": "AA", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10746756, "num_examples": 16628}], "download_size": 9922040, "dataset_size": 10746756}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-09T19:08:06+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ccds_mouse_512.csv"
More Information needed | [
"# Dataset Card for \"ccds_mouse_512.csv\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ccds_mouse_512.csv\"\n\nMore Information needed"
] | [
6,
20
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ccds_mouse_512.csv\"\n\nMore Information needed"
] |
3090a3b94ae0825ae5cb9c3baca9d21638ebb931 |

Inspired by the [trismegistus-project](https://huggingface.co/datasets/teknium/trismegistus-project) by teknium, I decided to build a high-quality dataset composed of some of the most important works for Western esoteric studies.
The dataset is currently composed of 20 carefully processed books by multiple authors in the field. The dataset lacks works on practical magic; this should be fixed in coming versions of this dataset.
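A minimal loading sketch, assuming the standard `datasets` API; the split and column names below are assumptions, not confirmed by this card:
```python
from datasets import load_dataset

# Load the corpus of esoteric texts for text-generation training
hermes = load_dataset("alexandreteles/hermes-toth", split="train")  # split name assumed
print(hermes[0]["text"][:200])  # "text" column assumed; adjust to the actual schema
```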
Pax! | alexandreteles/hermes-toth | [
"task_categories:text-generation",
"size_categories:1M<n<10M",
"language:en",
"license:agpl-3.0",
"spirituality",
"occultism",
"esoterism",
"region:us"
] | 2023-12-09T19:26:47+00:00 | {"language": ["en"], "license": "agpl-3.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-generation"], "pretty_name": "Hermes Toth", "tags": ["spirituality", "occultism", "esoterism"]} | 2024-02-10T03:59:41+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-1M<n<10M #language-English #license-agpl-3.0 #spirituality #occultism #esoterism #region-us
|
!img
Inspired by the trismegistus-project by teknium, I decided to build a high-quality dataset composed of some of the most important works for Western esoteric studies.
The dataset is currently composed of 20 carefully processed books by multiple authors in the field. The dataset lacks works on practical magic; this should be fixed in coming versions of this dataset. | [] | [
Pax! | [] | [
"TAGS\n#task_categories-text-generation #size_categories-1M<n<10M #language-English #license-agpl-3.0 #spirituality #occultism #esoterism #region-us \n"
] | [
53
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-1M<n<10M #language-English #license-agpl-3.0 #spirituality #occultism #esoterism #region-us \n"
] |
794434f35af33be65f4e36f695347a817848b861 |
# Dataset Card for Evaluation run of Zardos/Kant-Test-0.1-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Zardos/Kant-Test-0.1-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Zardos/Kant-Test-0.1-Mistral-7B](https://huggingface.co/Zardos/Kant-Test-0.1-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B",
"harness_winogrande_5",
split="train")
```
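The aggregated "results" configuration described above can be read the same way; a sketch, using the "latest" split named in this card's file list:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model
results = load_dataset("open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B",
	"results",
	split="latest")
```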
## Latest results
These are the [latest results from run 2023-12-10T11:05:46.345175](https://huggingface.co/datasets/open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B/blob/main/results_2023-12-10T11-05-46.345175.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6253667303213345,
"acc_stderr": 0.03253315196101968,
"acc_norm": 0.6318205791064505,
"acc_norm_stderr": 0.033203023073407084,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.4940424629624919,
"mc2_stderr": 0.014891468326851799
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216388,
"acc_norm": 0.6177474402730375,
"acc_norm_stderr": 0.014200454049979275
},
"harness|hellaswag|10": {
"acc": 0.6352320254929297,
"acc_stderr": 0.004803812631994957,
"acc_norm": 0.828918542123083,
"acc_norm_stderr": 0.0037581050431501257
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105652,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105652
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915435,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915435
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077812,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077812
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.01403694585038139,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.01403694585038139
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3642458100558659,
"acc_stderr": 0.016094338768474596,
"acc_norm": 0.3642458100558659,
"acc_norm_stderr": 0.016094338768474596
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818774,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687492,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687492
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.4940424629624919,
"mc2_stderr": 0.014891468326851799
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345396
},
"harness|gsm8k|5": {
"acc": 0.3115996967399545,
"acc_stderr": 0.012757375376754941
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B | [
"region:us"
] | 2023-12-09T19:37:18+00:00 | {"pretty_name": "Evaluation run of Zardos/Kant-Test-0.1-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Zardos/Kant-Test-0.1-Mistral-7B](https://huggingface.co/Zardos/Kant-Test-0.1-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T11:05:46.345175](https://huggingface.co/datasets/open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B/blob/main/results_2023-12-10T11-05-46.345175.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6253667303213345,\n \"acc_stderr\": 0.03253315196101968,\n \"acc_norm\": 0.6318205791064505,\n \"acc_norm_stderr\": 0.033203023073407084,\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.4940424629624919,\n \"mc2_stderr\": 0.014891468326851799\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216388,\n \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979275\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6352320254929297,\n \"acc_stderr\": 0.004803812631994957,\n \"acc_norm\": 0.828918542123083,\n \"acc_norm_stderr\": 0.0037581050431501257\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105652,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105652\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915435,\n \"acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915435\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.80970625798212,\n \"acc_stderr\": 0.01403694585038139,\n \"acc_norm\": 0.80970625798212,\n \"acc_norm_stderr\": 0.01403694585038139\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3642458100558659,\n \"acc_stderr\": 0.016094338768474596,\n \"acc_norm\": 0.3642458100558659,\n \"acc_norm_stderr\": 0.016094338768474596\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818774,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818774\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.4940424629624919,\n \"mc2_stderr\": 0.014891468326851799\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345396\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3115996967399545,\n \"acc_stderr\": 0.012757375376754941\n 
}\n}\n```", "repo_url": "https://huggingface.co/Zardos/Kant-Test-0.1-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|arc:challenge|25_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|arc:challenge|25_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|arc:challenge|25_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|gsm8k|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|gsm8k|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|gsm8k|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hellaswag|10_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hellaswag|10_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hellaswag|10_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-34-29.855469.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-34-29.855469.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T19-34-29.855469.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-45-27.448654.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T19-45-27.448654.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-05-46.345175.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T11-05-46.345175.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-05-46.345175.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T11-05-46.345175.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-05-46.345175.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": 
"2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", 
"data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": 
["**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["**/details_harness|winogrande|5_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": ["**/details_harness|winogrande|5_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["**/details_harness|winogrande|5_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T11-05-46.345175.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T19_34_29.855469", "path": ["results_2023-12-09T19-34-29.855469.parquet"]}, {"split": "2023_12_09T19_45_27.448654", "path": 
["results_2023-12-09T19-45-27.448654.parquet"]}, {"split": "2023_12_10T11_05_46.345175", "path": ["results_2023-12-10T11-05-46.345175.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T11-05-46.345175.parquet"]}]}]} | 2023-12-10T11:09:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Zardos/Kant-Test-0.1-Mistral-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Zardos/Kant-Test-0.1-Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
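The original snippet was stripped during processing; a minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming:

```python
from datasets import load_dataset

# Details for one evaluated task (WinoGrande, 5-shot); the repository name
# below is assumed from the details_<org>__<model> convention.
data = load_dataset("open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B",
	"harness_winogrande_5",
	split="train")
```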
## Latest results
These are the latest results from run 2023-12-10T11:05:46.345175 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Zardos/Kant-Test-0.1-Mistral-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Zardos/Kant-Test-0.1-Mi... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Zardos/Kant-Test-0.1-Mistral-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mode... | [
6,
23,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Zardos/Kant-Test-0.1-Mistral-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Zardos/... |
cab7e6da7727c300be5214f623f0101310a5ba20 |
# Dataset Card for Evaluation run of Sao10K/Venomia-m7
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Venomia-m7
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Venomia-m7](https://huggingface.co/Sao10K/Venomia-m7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Details for one evaluated task (WinoGrande, 5-shot); the "train" split
# always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_Sao10K__Venomia-m7",
	"harness_winogrande_5",
	split="train")
```
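Since the "results" configuration aggregates every run and the "latest" split mirrors the most recent timestamped split, the aggregated metrics can be loaded the same way; a minimal sketch:

```python
from datasets import load_dataset

# Aggregated metrics for the newest run; "latest" is the split alias
# described above, pointing at the most recent timestamped split.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Venomia-m7",
	"results",
	split="latest")
print(results[0])  # one row of aggregated scores
```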
## Latest results
These are the [latest results from run 2023-12-09T19:38:57.975905](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Venomia-m7/blob/main/results_2023-12-09T19-38-57.975905.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5994561896653553,
"acc_stderr": 0.032914574924459074,
"acc_norm": 0.6051903322222639,
"acc_norm_stderr": 0.03358493802168653,
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.49078721070216347,
"mc2_stderr": 0.015495976475887885
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6313993174061433,
"acc_norm_stderr": 0.014097810678042194
},
"harness|hellaswag|10": {
"acc": 0.6635132443736308,
"acc_stderr": 0.004715419139697519,
"acc_norm": 0.8399721171081458,
"acc_norm_stderr": 0.0036588262081016106
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.02951470358398177,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.02951470358398177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113728,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113728
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.03074630074212451,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.03074630074212451
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443135,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443135
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242832,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242832
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.01541449448790323,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.01541449448790323
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.026568921015457152,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.026568921015457152
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.01264300462379021,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.01264300462379021
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215923,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215923
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.49078721070216347,
"mc2_stderr": 0.015495976475887885
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174782
},
"harness|gsm8k|5": {
"acc": 0.3237300985595148,
"acc_stderr": 0.01288824739737114
}
}
```
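To work with these numbers programmatically, for instance to rank the MMLU subtasks by score, here is a short sketch; it assumes the per-run JSON file linked above has the same top-level layout as the dictionary printed here:

```python
import json

# A minimal sketch: parse the per-run results file referenced above and
# rank the MMLU (hendrycksTest) subtasks by accuracy.
with open("results_2023-12-09T19-38-57.975905.json") as f:
    results = json.load(f)

# Keys look like "harness|hendrycksTest-<subtask>|5"; extract <subtask>.
mmlu = {
    task.split("-", 1)[1].split("|")[0]: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
for name, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{name}: {acc:.3f}")
```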
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Sao10K__Venomia-m7 | [
"region:us"
] | 2023-12-09T19:41:51+00:00 | {"pretty_name": "Evaluation run of Sao10K/Venomia-m7", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Venomia-m7](https://huggingface.co/Sao10K/Venomia-m7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Venomia-m7\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T19:38:57.975905](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Venomia-m7/blob/main/results_2023-12-09T19-38-57.975905.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5994561896653553,\n \"acc_stderr\": 0.032914574924459074,\n \"acc_norm\": 0.6051903322222639,\n \"acc_norm_stderr\": 0.03358493802168653,\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.49078721070216347,\n \"mc2_stderr\": 0.015495976475887885\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.014097810678042194\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6635132443736308,\n \"acc_stderr\": 0.004715419139697519,\n \"acc_norm\": 0.8399721171081458,\n \"acc_norm_stderr\": 0.0036588262081016106\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.02951470358398177,\n \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.02951470358398177\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113728,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113728\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.6870967741935484,\n \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.03074630074212451,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.03074630074212451\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n \"acc_norm\": 
0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443135,\n \"acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443135\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 
0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242832,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242832\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n \"acc_stderr\": 0.01541449448790323,\n \"acc_norm\": 0.30614525139664805,\n \"acc_norm_stderr\": 0.01541449448790323\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457152,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457152\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n \"acc_stderr\": 0.01264300462379021,\n \"acc_norm\": 0.42959582790091266,\n \"acc_norm_stderr\": 0.01264300462379021\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215923,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215923\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.49078721070216347,\n \"mc2_stderr\": 0.015495976475887885\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174782\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3237300985595148,\n \"acc_stderr\": 0.01288824739737114\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Venomia-m7", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|arc:challenge|25_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|gsm8k|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hellaswag|10_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-38-57.975905.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-38-57.975905.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-38-57.975905.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T19-38-57.975905.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-38-57.975905.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-38-57.975905.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["**/details_harness|winogrande|5_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T19-38-57.975905.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T19_38_57.975905", "path": ["results_2023-12-09T19-38-57.975905.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T19-38-57.975905.parquet"]}]}]} | 2023-12-09T19:42:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Sao10K/Venomia-m7
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Sao10K/Venomia-m7 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
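```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Venomia-m7",
	"harness_winogrande_5",
	split="train")
```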
## Latest results
These are the latest results from run 2023-12-09T19:38:57.975905 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Sao10K/Venomia-m7",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Venomia-m7 on the Open LLM Lea... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Sao10K/Venomia-m7",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Venom... | [
6,
18,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Sao10K/Venomia-m7## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Venomia-m7 on ... |
2739c3884b8251ea1832b46d1319e1034d1c93e8 |
# Dataset Card for Evaluation run of ajibawa-2023/Code-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ajibawa-2023/Code-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ajibawa-2023/Code-13B](https://huggingface.co/ajibawa-2023/Code-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Code-13B",
"harness_winogrande_5",
split="train")
```
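Each evaluated task is exposed as its own configuration, so you can also enumerate the available configurations programmatically and pull the aggregated metrics from the "results" configuration. The sketch below assumes the config layout described above (a `results` config with a `latest` split, as in the other cards of this collection):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_ajibawa-2023__Code-13B"

# Enumerate the per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs))

# The "latest" split of the "results" config holds the aggregated metrics
# from the most recent run.
results = load_dataset(repo, "results", split="latest")
```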
## Latest results
These are the [latest results from run 2023-12-09T19:40:16.694610](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-13B/blob/main/results_2023-12-09T19-40-16.694610.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5315302469691541,
"acc_stderr": 0.0338171547995471,
"acc_norm": 0.5374650243523146,
"acc_norm_stderr": 0.034550805778528454,
"mc1": 0.2962056303549572,
"mc1_stderr": 0.01598359510181139,
"mc2": 0.4246156253859874,
"mc2_stderr": 0.01586771249517698
},
"harness|arc:challenge|25": {
"acc": 0.5511945392491467,
"acc_stderr": 0.014534599585097665,
"acc_norm": 0.5733788395904437,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.6420035849432384,
"acc_stderr": 0.004784312972495391,
"acc_norm": 0.8328022306313483,
"acc_norm_stderr": 0.003723897305645496
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244441,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244441
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4717948717948718,
"acc_stderr": 0.025310639254933882,
"acc_norm": 0.4717948717948718,
"acc_norm_stderr": 0.025310639254933882
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6844036697247706,
"acc_stderr": 0.019926117513869666,
"acc_norm": 0.6844036697247706,
"acc_norm_stderr": 0.019926117513869666
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374983,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374983
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196697,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7215836526181354,
"acc_stderr": 0.016028295188992476,
"acc_norm": 0.7215836526181354,
"acc_norm_stderr": 0.016028295188992476
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357628,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.01520103251252044,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.01520103251252044
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647011998,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647011998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.027487472980871588,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.027487472980871588
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778862,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.408735332464146,
"acc_stderr": 0.012555701346703385,
"acc_norm": 0.408735332464146,
"acc_norm_stderr": 0.012555701346703385
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5179738562091504,
"acc_stderr": 0.020214761037872404,
"acc_norm": 0.5179738562091504,
"acc_norm_stderr": 0.020214761037872404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.031001209039894843,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.031001209039894843
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2962056303549572,
"mc1_stderr": 0.01598359510181139,
"mc2": 0.4246156253859874,
"mc2_stderr": 0.01586771249517698
},
"harness|winogrande|5": {
"acc": 0.7355958958168903,
"acc_stderr": 0.012394724896983799
},
"harness|gsm8k|5": {
"acc": 0.19029567854435178,
"acc_stderr": 0.010812347283182974
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_ajibawa-2023__Code-13B | [
"region:us"
] | 2023-12-09T19:43:11+00:00 | {"pretty_name": "Evaluation run of ajibawa-2023/Code-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ajibawa-2023/Code-13B](https://huggingface.co/ajibawa-2023/Code-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ajibawa-2023__Code-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T19:40:16.694610](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-13B/blob/main/results_2023-12-09T19-40-16.694610.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5315302469691541,\n \"acc_stderr\": 0.0338171547995471,\n \"acc_norm\": 0.5374650243523146,\n \"acc_norm_stderr\": 0.034550805778528454,\n \"mc1\": 0.2962056303549572,\n \"mc1_stderr\": 0.01598359510181139,\n \"mc2\": 0.4246156253859874,\n \"mc2_stderr\": 0.01586771249517698\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5511945392491467,\n \"acc_stderr\": 0.014534599585097665,\n \"acc_norm\": 0.5733788395904437,\n \"acc_norm_stderr\": 0.014453185592920293\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6420035849432384,\n \"acc_stderr\": 0.004784312972495391,\n \"acc_norm\": 0.8328022306313483,\n \"acc_norm_stderr\": 0.003723897305645496\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 
0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.6290322580645161,\n \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4717948717948718,\n \"acc_stderr\": 0.025310639254933882,\n \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.025310639254933882\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6844036697247706,\n \"acc_stderr\": 0.019926117513869666,\n \"acc_norm\": 0.6844036697247706,\n \"acc_norm_stderr\": 0.019926117513869666\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.03275773486100999,\n \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.03275773486100999\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.04453197507374983,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.04453197507374983\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.027236013946196697,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.027236013946196697\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7215836526181354,\n \"acc_stderr\": 0.016028295188992476,\n \"acc_norm\": 
0.7215836526181354,\n \"acc_norm_stderr\": 0.016028295188992476\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n \"acc_stderr\": 0.01520103251252044,\n \"acc_norm\": 0.2916201117318436,\n \"acc_norm_stderr\": 0.01520103251252044\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n \"acc_stderr\": 0.027731258647011998,\n \"acc_norm\": 0.6077170418006431,\n \"acc_norm_stderr\": 0.027731258647011998\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.027487472980871588,\n \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.027487472980871588\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778862,\n \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.408735332464146,\n \"acc_stderr\": 0.012555701346703385,\n \"acc_norm\": 0.408735332464146,\n \"acc_norm_stderr\": 0.012555701346703385\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.030372015885428195,\n \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.030372015885428195\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5179738562091504,\n \"acc_stderr\": 0.020214761037872404,\n \"acc_norm\": 0.5179738562091504,\n \"acc_norm_stderr\": 0.020214761037872404\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.031001209039894843,\n \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.031001209039894843\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2962056303549572,\n \"mc1_stderr\": 0.01598359510181139,\n \"mc2\": 0.4246156253859874,\n \"mc2_stderr\": 0.01586771249517698\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19029567854435178,\n \"acc_stderr\": 0.010812347283182974\n }\n}\n```", "repo_url": "https://huggingface.co/ajibawa-2023/Code-13B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|arc:challenge|25_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|gsm8k|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hellaswag|10_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-40-16.694610.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-40-16.694610.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-40-16.694610.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T19-40-16.694610.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-40-16.694610.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-40-16.694610.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["**/details_harness|winogrande|5_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T19-40-16.694610.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T19_40_16.694610", "path": ["results_2023-12-09T19-40-16.694610.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T19-40-16.694610.parquet"]}]}]} | 2023-12-09T19:43:53+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ajibawa-2023/Code-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model ajibawa-2023/Code-13B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
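A minimal sketch, using one of the 63 configs (`harness_winogrande_5`) as an example:

```python
from datasets import load_dataset

# Load the per-sample details for one eval config of this run
data = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Code-13B",
	"harness_winogrande_5",
	split="train")
```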
## Latest results
These are the latest results from run 2023-12-09T19:40:16.694610 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
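For instance, a minimal sketch for pulling the aggregated numbers, assuming the `results` config and `latest` split described above:

```python
from datasets import load_dataset

# The "results" config aggregates all task metrics; "latest" tracks the newest run
results = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Code-13B",
	"results",
	split="latest")
```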
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of ajibawa-2023/Code-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ajibawa-2023/Code-13B on the Open... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ajibawa-2023/Code-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ajibawa-... | [
6,
18,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ajibawa-2023/Code-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ajibawa-2023/Code... |
fcbbfad17efcfece0821d550ea5f348ba1a233c7 |
# Dataset Card for Evaluation run of PulsarAI/Neural-una-cybertron-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/Neural-una-cybertron-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/Neural-una-cybertron-7b](https://huggingface.co/PulsarAI/Neural-una-cybertron-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b",
"harness_winogrande_5",
split="train")
```
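
Similarly, a sketch for loading the aggregated results mentioned above (the `results` config, `latest` split):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b",
	"results",
	split="latest")
```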
## Latest results
These are the [latest results from run 2023-12-09T19:49:04.690282](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b/blob/main/results_2023-12-09T19-49-04.690282.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6303659109315263,
"acc_stderr": 0.032701507219088696,
"acc_norm": 0.6326609738082676,
"acc_norm_stderr": 0.033364878181962175,
"mc1": 0.49938800489596086,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6498823682901811,
"mc2_stderr": 0.01528184743332698
},
"harness|arc:challenge|25": {
"acc": 0.6604095563139932,
"acc_stderr": 0.013839039762820164,
"acc_norm": 0.6902730375426621,
"acc_norm_stderr": 0.013512058415238363
},
"harness|hellaswag|10": {
"acc": 0.6704839673371839,
"acc_stderr": 0.004690768393854475,
"acc_norm": 0.8450507866958773,
"acc_norm_stderr": 0.0036111673029597625
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155247,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155247
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790492,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790492
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723875,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635474,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635474
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847835,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847835
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489277,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489277
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381398,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381398
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3787709497206704,
"acc_stderr": 0.016223533510365113,
"acc_norm": 0.3787709497206704,
"acc_norm_stderr": 0.016223533510365113
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388856,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388856
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.01270058240476822,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.01270058240476822
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696647,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696647
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623325,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623325
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49938800489596086,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6498823682901811,
"mc2_stderr": 0.01528184743332698
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.011099796645920524
},
"harness|gsm8k|5": {
"acc": 0.5231235784685367,
"acc_stderr": 0.013757748544245336
}
}
```
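If you want to scan the per-task numbers above rather than read raw JSON, a minimal sketch is to flatten them into a table (this assumes the blob has been saved locally as `results.json` and that `pandas` is available; neither is part of the card itself):

```python
import json

import pandas as pd

# Load the per-task metrics dumped above (one dict per harness entry).
with open("results.json") as f:
    results = json.load(f)

# One row per task; tasks that lack a given metric (e.g. truthfulqa only
# reports mc1/mc2, not "acc") simply get NaN in that column.
df = pd.DataFrame.from_dict(results, orient="index")
print(df.sort_values("acc", ascending=False).head(10))
```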
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b | [
"region:us"
] | 2023-12-09T19:51:54+00:00 | {"pretty_name": "Evaluation run of PulsarAI/Neural-una-cybertron-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [PulsarAI/Neural-una-cybertron-7b](https://huggingface.co/PulsarAI/Neural-una-cybertron-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T19:49:04.690282](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b/blob/main/results_2023-12-09T19-49-04.690282.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6303659109315263,\n \"acc_stderr\": 0.032701507219088696,\n \"acc_norm\": 0.6326609738082676,\n \"acc_norm_stderr\": 0.033364878181962175,\n \"mc1\": 0.49938800489596086,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6498823682901811,\n \"mc2_stderr\": 0.01528184743332698\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6604095563139932,\n \"acc_stderr\": 0.013839039762820164,\n \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238363\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6704839673371839,\n \"acc_stderr\": 0.004690768393854475,\n \"acc_norm\": 0.8450507866958773,\n \"acc_norm_stderr\": 0.0036111673029597625\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155247,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155247\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790492,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790492\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635474,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635474\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847835,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847835\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489277,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489277\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.80970625798212,\n \"acc_stderr\": 0.014036945850381398,\n \"acc_norm\": 0.80970625798212,\n \"acc_norm_stderr\": 0.014036945850381398\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n \"acc_stderr\": 0.016223533510365113,\n \"acc_norm\": 0.3787709497206704,\n \"acc_norm_stderr\": 0.016223533510365113\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388856,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388856\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n \"acc_stderr\": 0.01270058240476822,\n \"acc_norm\": 0.44784876140808344,\n \"acc_norm_stderr\": 0.01270058240476822\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623325,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623325\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49938800489596086,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6498823682901811,\n \"mc2_stderr\": 0.01528184743332698\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920524\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5231235784685367,\n \"acc_stderr\": 0.013757748544245336\n }\n}\n```", 
"repo_url": "https://huggingface.co/PulsarAI/Neural-una-cybertron-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|arc:challenge|25_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|gsm8k|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hellaswag|10_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-49-04.690282.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-49-04.690282.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-49-04.690282.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T19-49-04.690282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-49-04.690282.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T19_49_04.690282", "path": ["**/details_harness|winogrande|5_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T19-49-04.690282.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T19_49_04.690282", "path": ["results_2023-12-09T19-49-04.690282.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T19-49-04.690282.parquet"]}]}]} | 2023-12-09T19:52:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of PulsarAI/Neural-una-cybertron-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model PulsarAI/Neural-una-cybertron-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
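For example (a sketch mirroring the loading snippet used elsewhere in this dump; the dataset id below is taken from this record's metadata):

```python
from datasets import load_dataset

# Any task configuration works here; "harness_winogrande_5" is one of the 63.
data = load_dataset("open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b",
	"harness_winogrande_5",
	split="train")
```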
## Latest results
These are the latest results from run 2023-12-09T19:49:04.690282 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
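To pull these aggregated metrics programmatically, load the "results" configuration (a sketch; the configuration name and its "latest" split are declared in this record's metadata):

```python
from datasets import load_dataset

# The "latest" split always points at the most recent run's aggregated parquet.
results = load_dataset("open-llm-leaderboard/details_PulsarAI__Neural-una-cybertron-7b",
	"results",
	split="latest")
```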
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of PulsarAI/Neural-una-cybertron-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model PulsarAI/Neural-una-cy... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of PulsarAI/Neural-una-cybertron-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mod... | [
6,
23,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PulsarAI/Neural-una-cybertron-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Pulsar... |
fb64048b6224d11ca19043e17b1d7e55fa52c18b |
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.4-preview2](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2",
"harness_winogrande_5",
split="train")
```
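The aggregated metrics mentioned above can be inspected the same way through the "results" configuration (a minimal sketch; `split="latest"` is the split defined for this configuration in the dataset metadata, and the exact column layout of the results parquet may vary):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split always points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2",
	"results",
	split="latest")
print(results[0])
```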
## Latest results
These are the [latest results from run 2023-12-09T19:57:57.872670](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2/blob/main/results_2023-12-09T19-57-57.872670.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5440235971329553,
"acc_stderr": 0.03410726380039453,
"acc_norm": 0.5490928177495088,
"acc_norm_stderr": 0.03483965758622219,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5379290576758808,
"mc2_stderr": 0.01514579551273296
},
"harness|arc:challenge|25": {
"acc": 0.5093856655290102,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.5298634812286689,
"acc_norm_stderr": 0.014585305840007105
},
"harness|hellaswag|10": {
"acc": 0.5582553276239793,
"acc_stderr": 0.004955798214513426,
"acc_norm": 0.7453694483170683,
"acc_norm_stderr": 0.004347629889040944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.042667634040995814,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.042667634040995814
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.040463368839782514,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.040463368839782514
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.03015113445777629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03015113445777629
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.0248708152510571,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.0248708152510571
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.02727389059430064,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.02727389059430064
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036810508691615486,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036810508691615486
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.03257714077709662,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.03257714077709662
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.025334667080954915,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.025334667080954915
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.032339434681820885,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.032339434681820885
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7137614678899082,
"acc_stderr": 0.01937943662891999,
"acc_norm": 0.7137614678899082,
"acc_norm_stderr": 0.01937943662891999
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.029818024749753088,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.029818024749753088
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212093,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212093
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7420178799489144,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.7420178799489144,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5751445086705202,
"acc_stderr": 0.026613350840261743,
"acc_norm": 0.5751445086705202,
"acc_norm_stderr": 0.026613350840261743
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20446927374301677,
"acc_stderr": 0.013488813404711903,
"acc_norm": 0.20446927374301677,
"acc_norm_stderr": 0.013488813404711903
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02782610930728369,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02782610930728369
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.02777091853142784,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.02777091853142784
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.0272725828498398,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.0272725828498398
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39374185136897,
"acc_stderr": 0.012478532272564447,
"acc_norm": 0.39374185136897,
"acc_norm_stderr": 0.012478532272564447
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.03000856284500348,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.03000856284500348
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355586,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355586
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691583,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691583
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5379290576758808,
"mc2_stderr": 0.01514579551273296
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998285
},
"harness|gsm8k|5": {
"acc": 0.25701288855193327,
"acc_stderr": 0.012036781757428675
}
}
```
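The same numbers can also be read directly from the raw results file linked above (a minimal sketch using `huggingface_hub`; it assumes the file's top-level layout matches the dict printed above, which may differ in the actual file):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2",
    filename="results_2023-12-09T19-57-57.872670.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Assumes the JSON mirrors the dict shown above; adjust if the keys are nested differently.
print(results["all"]["acc"])
```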
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2 | [
"region:us"
] | 2023-12-09T20:00:47+00:00 | {"pretty_name": "Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview2", "dataset_summary": "Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.4-preview2](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T19:57:57.872670](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2/blob/main/results_2023-12-09T19-57-57.872670.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5440235971329553,\n \"acc_stderr\": 0.03410726380039453,\n \"acc_norm\": 0.5490928177495088,\n \"acc_norm_stderr\": 0.03483965758622219,\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5379290576758808,\n \"mc2_stderr\": 0.01514579551273296\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5093856655290102,\n \"acc_stderr\": 0.014608816322065,\n \"acc_norm\": 0.5298634812286689,\n \"acc_norm_stderr\": 0.014585305840007105\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5582553276239793,\n \"acc_stderr\": 0.004955798214513426,\n \"acc_norm\": 0.7453694483170683,\n \"acc_norm_stderr\": 0.004347629889040944\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n \"acc_stderr\": 0.042667634040995814,\n \"acc_norm\": 0.4222222222222222,\n \"acc_norm_stderr\": 0.042667634040995814\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.040463368839782514,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.040463368839782514\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03015113445777629,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03015113445777629\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04122728707651282\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.03252909619613197,\n \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.03252909619613197\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.0248708152510571,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.0248708152510571\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n \"acc_stderr\": 0.02727389059430064,\n \"acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.02727389059430064\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036810508691615486,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036810508691615486\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.03257714077709662,\n \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 
0.03257714077709662\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.025334667080954915,\n \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.025334667080954915\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.032339434681820885,\n \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.032339434681820885\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7137614678899082,\n \"acc_stderr\": 0.01937943662891999,\n \"acc_norm\": 0.7137614678899082,\n \"acc_norm_stderr\": 0.01937943662891999\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.70042194092827,\n \"acc_stderr\": 0.029818024749753088,\n \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.029818024749753088\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212093,\n \"acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212093\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7420178799489144,\n \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.7420178799489144,\n \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5751445086705202,\n \"acc_stderr\": 0.026613350840261743,\n \"acc_norm\": 0.5751445086705202,\n \"acc_norm_stderr\": 0.026613350840261743\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.20446927374301677,\n \"acc_stderr\": 0.013488813404711903,\n \"acc_norm\": 0.20446927374301677,\n \"acc_norm_stderr\": 0.013488813404711903\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02782610930728369,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02782610930728369\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.0272725828498398,\n \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.0272725828498398\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39374185136897,\n \"acc_stderr\": 0.012478532272564447,\n \"acc_norm\": 0.39374185136897,\n \"acc_norm_stderr\": 0.012478532272564447\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.03000856284500348,\n \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.03000856284500348\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.020219083895133924,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.020219083895133924\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n \"acc_stderr\": 0.031157150869355586,\n \"acc_norm\": 0.736318407960199,\n \"acc_norm_stderr\": 0.031157150869355586\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5379290576758808,\n \"mc2_stderr\": 0.01514579551273296\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998285\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.25701288855193327,\n \"acc_stderr\": 0.012036781757428675\n }\n}\n```", "repo_url": "https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|arc:challenge|25_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|gsm8k|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hellaswag|10_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-57-57.872670.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-57-57.872670.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-57-57.872670.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T19-57-57.872670.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-57-57.872670.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T19_57_57.872670", "path": ["**/details_harness|winogrande|5_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T19-57-57.872670.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T19_57_57.872670", "path": ["results_2023-12-09T19-57-57.872670.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T19-57-57.872670.parquet"]}]}]} | 2023-12-09T20:01:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model WebraftAI/synapsellm-7b-mistral-v0.4-preview2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
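For instance, a minimal sketch (the repository id below is an assumption, following the standard `open-llm-leaderboard/details_<org>__<model>` naming convention these evaluation datasets use):

```python
from datasets import load_dataset

# Assumed repository id, following the usual details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview2",
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="train",           # "train" always points to the latest results
)
```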
## Latest results
These are the latest results from run 2023-12-09T19:57:57.872670 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
6e339fb6ca3d695277824b1b4a7e6ac20edc31a3 |
# Dataset Card for Evaluation run of ContextualAI/archangel_sft-kto_llama13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ContextualAI/archangel_sft-kto_llama13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ContextualAI/archangel_sft-kto_llama13b](https://huggingface.co/ContextualAI/archangel_sft-kto_llama13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ContextualAI__archangel_sft-kto_llama13b",
"harness_winogrande_5",
split="train")
```
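Beyond the per-task configurations, the aggregated metrics described above can be loaded from the "results" configuration; a short sketch (the timestamped split name below comes from this card's configuration metadata):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run.
results_latest = load_dataset(
    "open-llm-leaderboard/details_ContextualAI__archangel_sft-kto_llama13b",
    "results",
    split="latest",
)

# A specific run can be selected by its timestamped split name.
results_run = load_dataset(
    "open-llm-leaderboard/details_ContextualAI__archangel_sft-kto_llama13b",
    "results",
    split="2023_12_09T20_01_05.918025",
)
```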
## Latest results
These are the [latest results from run 2023-12-09T20:01:05.918025](https://huggingface.co/datasets/open-llm-leaderboard/details_ContextualAI__archangel_sft-kto_llama13b/blob/main/results_2023-12-09T20-01-05.918025.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4808497396801513,
"acc_stderr": 0.0342816178342491,
"acc_norm": 0.48534799426464065,
"acc_norm_stderr": 0.03504863417527385,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.39418229629364515,
"mc2_stderr": 0.013748123967336172
},
"harness|arc:challenge|25": {
"acc": 0.5264505119453925,
"acc_stderr": 0.01459093135812017,
"acc_norm": 0.5614334470989761,
"acc_norm_stderr": 0.014500682618212864
},
"harness|hellaswag|10": {
"acc": 0.6093407687711612,
"acc_stderr": 0.004869010152280754,
"acc_norm": 0.8080063732324239,
"acc_norm_stderr": 0.003930631369978262
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.04060127035236395,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.04060127035236395
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.037507570448955356,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.037507570448955356
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835361,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835361
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.02264421261552521,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.02264421261552521
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.028414985019707868,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.028414985019707868
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03547601494006937,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03547601494006937
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836183,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836183
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4579831932773109,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.4579831932773109,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.618348623853211,
"acc_stderr": 0.020828148517022582,
"acc_norm": 0.618348623853211,
"acc_norm_stderr": 0.020828148517022582
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456052,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239171,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239171
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5291479820627802,
"acc_stderr": 0.03350073248773403,
"acc_norm": 0.5291479820627802,
"acc_norm_stderr": 0.03350073248773403
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5214723926380368,
"acc_stderr": 0.03924746876751129,
"acc_norm": 0.5214723926380368,
"acc_norm_stderr": 0.03924746876751129
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6615581098339719,
"acc_stderr": 0.016920869586210675,
"acc_norm": 0.6615581098339719,
"acc_norm_stderr": 0.016920869586210675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.015201032512520436,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.015201032512520436
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5498392282958199,
"acc_stderr": 0.028256660723360173,
"acc_norm": 0.5498392282958199,
"acc_norm_stderr": 0.028256660723360173
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5154320987654321,
"acc_stderr": 0.02780749004427619,
"acc_norm": 0.5154320987654321,
"acc_norm_stderr": 0.02780749004427619
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611324,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37614080834419816,
"acc_stderr": 0.012372214430599814,
"acc_norm": 0.37614080834419816,
"acc_norm_stderr": 0.012372214430599814
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4820261437908497,
"acc_stderr": 0.020214761037872404,
"acc_norm": 0.4820261437908497,
"acc_norm_stderr": 0.020214761037872404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.39418229629364515,
"mc2_stderr": 0.013748123967336172
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702311
},
"harness|gsm8k|5": {
"acc": 0.1683093252463988,
"acc_stderr": 0.010305695358125522
}
}
```
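Once parsed, this structure is straightforward to slice; a small sketch (assuming `results` holds the dictionary shown above) that averages the hendrycksTest (MMLU) subtasks and prints the headline metrics:

```python
# Assumes `results` is the dictionary shown above.
mmlu = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest")
}
print(f"MMLU subtasks: {len(mmlu)}, mean acc_norm: {sum(mmlu.values()) / len(mmlu):.4f}")

# Headline metrics reported on the leaderboard:
print("ARC acc_norm:      ", results["harness|arc:challenge|25"]["acc_norm"])
print("HellaSwag acc_norm:", results["harness|hellaswag|10"]["acc_norm"])
print("Winogrande acc:    ", results["harness|winogrande|5"]["acc"])
print("GSM8K acc:         ", results["harness|gsm8k|5"]["acc"])
```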
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_ContextualAI__archangel_sft-kto_llama13b | [
"region:us"
] | 2023-12-09T20:03:19+00:00 | {"pretty_name": "Evaluation run of ContextualAI/archangel_sft-kto_llama13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ContextualAI/archangel_sft-kto_llama13b](https://huggingface.co/ContextualAI/archangel_sft-kto_llama13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ContextualAI__archangel_sft-kto_llama13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T20:01:05.918025](https://huggingface.co/datasets/open-llm-leaderboard/details_ContextualAI__archangel_sft-kto_llama13b/blob/main/results_2023-12-09T20-01-05.918025.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4808497396801513,\n \"acc_stderr\": 0.0342816178342491,\n \"acc_norm\": 0.48534799426464065,\n \"acc_norm_stderr\": 0.03504863417527385,\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.39418229629364515,\n \"mc2_stderr\": 0.013748123967336172\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5264505119453925,\n \"acc_stderr\": 0.01459093135812017,\n \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212864\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6093407687711612,\n \"acc_stderr\": 0.004869010152280754,\n \"acc_norm\": 0.8080063732324239,\n \"acc_norm_stderr\": 0.003930631369978262\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.04060127035236395,\n \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.04060127035236395\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.04179596617581\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.037507570448955356,\n \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.037507570448955356\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835361,\n \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835361\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.041307408795554966,\n \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.041307408795554966\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.02264421261552521,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.02264421261552521\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n \"acc_stderr\": 0.028414985019707868,\n \"acc_norm\": 0.5225806451612903,\n \"acc_norm_stderr\": 0.028414985019707868\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.03547601494006937,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03547601494006937\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836183,\n \"acc_norm\": 0.6632124352331606,\n 
\"acc_norm_stderr\": 0.03410780251836183\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986472,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986472\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4579831932773109,\n \"acc_stderr\": 0.03236361111951941,\n \"acc_norm\": 0.4579831932773109,\n \"acc_norm_stderr\": 0.03236361111951941\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.618348623853211,\n \"acc_stderr\": 0.020828148517022582,\n \"acc_norm\": 0.618348623853211,\n \"acc_norm_stderr\": 0.020828148517022582\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 0.03099866630456052,\n \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.03099866630456052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239171,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239171\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6919831223628692,\n \"acc_stderr\": 0.0300523893356057,\n \"acc_norm\": 0.6919831223628692,\n \"acc_norm_stderr\": 0.0300523893356057\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5291479820627802,\n \"acc_stderr\": 0.03350073248773403,\n \"acc_norm\": 0.5291479820627802,\n \"acc_norm_stderr\": 0.03350073248773403\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5214723926380368,\n \"acc_stderr\": 0.03924746876751129,\n \"acc_norm\": 0.5214723926380368,\n \"acc_norm_stderr\": 0.03924746876751129\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.7307692307692307,\n \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6615581098339719,\n \"acc_stderr\": 0.016920869586210675,\n \"acc_norm\": 0.6615581098339719,\n \"acc_norm_stderr\": 0.016920869586210675\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.02690784985628254,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.02690784985628254\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n \"acc_stderr\": 0.015201032512520436,\n \"acc_norm\": 0.2916201117318436,\n \"acc_norm_stderr\": 0.015201032512520436\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5498392282958199,\n \"acc_stderr\": 0.028256660723360173,\n \"acc_norm\": 0.5498392282958199,\n \"acc_norm_stderr\": 0.028256660723360173\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5154320987654321,\n \"acc_stderr\": 0.02780749004427619,\n \"acc_norm\": 0.5154320987654321,\n \"acc_norm_stderr\": 0.02780749004427619\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611324,\n \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611324\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37614080834419816,\n \"acc_stderr\": 0.012372214430599814,\n \"acc_norm\": 0.37614080834419816,\n \"acc_norm_stderr\": 0.012372214430599814\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904611,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904611\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872404,\n \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872404\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.39418229629364515,\n \"mc2_stderr\": 0.013748123967336172\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702311\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1683093252463988,\n 
\"acc_stderr\": 0.010305695358125522\n }\n}\n```", "repo_url": "https://huggingface.co/ContextualAI/archangel_sft-kto_llama13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-05.918025.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-05.918025.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-05.918025.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-05.918025.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-05.918025.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T20_01_05.918025", "path": ["**/details_harness|winogrande|5_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T20-01-05.918025.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T20_01_05.918025", "path": ["results_2023-12-09T20-01-05.918025.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T20-01-05.918025.parquet"]}]}]} | 2023-12-09T20:04:03+00:00 | [] | [] | TAGS
de3a20f4e5d7e8668614618e1f8cde8e14065648 |
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.5-preview](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview",
"harness_winogrande_5",
split="train")
```
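
Each per-task configuration also exposes a "latest" split (plus one timestamped split per run), and the aggregated metrics live in the dedicated "results" configuration. Below is a minimal sketch of both, reusing the repository and config names from this card; the record layout of the loaded splits is not documented here, so treat any field access on the returned datasets as an assumption:

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview"

# List all 63 per-task configurations (plus "results").
configs = get_dataset_config_names(REPO)

# "latest" always points to the most recent evaluation run of a config.
winogrande_details = load_dataset(REPO, "harness_winogrande_5", split="latest")

# The "results" configuration aggregates the metrics of every task per run.
aggregated = load_dataset(REPO, "results", split="latest")
```

Timestamped splits such as `2023_12_09T20_01_18.948310` (listed in this dataset's configuration metadata) pin a specific run instead of the latest one.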
## Latest results
These are the [latest results from run 2023-12-09T20:01:18.948310](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview/blob/main/results_2023-12-09T20-01-18.948310.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5441057040654342,
"acc_stderr": 0.03404499199717172,
"acc_norm": 0.5501066597591592,
"acc_norm_stderr": 0.034782781683925894,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5516274394366725,
"mc2_stderr": 0.01504190113817455
},
"harness|arc:challenge|25": {
"acc": 0.4931740614334471,
"acc_stderr": 0.014610029151379813,
"acc_norm": 0.5273037542662116,
"acc_norm_stderr": 0.014589589101985994
},
"harness|hellaswag|10": {
"acc": 0.5624377614021111,
"acc_stderr": 0.004950723480149757,
"acc_norm": 0.7650866361282613,
"acc_norm_stderr": 0.004230782375004432
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296562,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296562
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009794,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009794
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.0381189098894041,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.0381189098894041
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.02467786284133278,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.02467786284133278
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.032087795587867514,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.032087795587867514
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.032922966391551414,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.032922966391551414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49743589743589745,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.49743589743589745,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094527,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094527
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.032449808499900284,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.032449808499900284
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7064220183486238,
"acc_stderr": 0.019525151122639667,
"acc_norm": 0.7064220183486238,
"acc_norm_stderr": 0.019525151122639667
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6835443037974683,
"acc_stderr": 0.030274974880218977,
"acc_norm": 0.6835443037974683,
"acc_norm_stderr": 0.030274974880218977
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077785,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077785
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7420178799489144,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.7420178799489144,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.026296227915613674,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.026296227915613674
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371544,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631438,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.027513925683549434,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.027513925683549434
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.02720111766692565,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.02720111766692565
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.028999080904806185,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.028999080904806185
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.394393741851369,
"acc_stderr": 0.012482141665631184,
"acc_norm": 0.394393741851369,
"acc_norm_stderr": 0.012482141665631184
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.020212274976302954,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.020212274976302954
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505415,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505415
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.038581589406855174,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.038581589406855174
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5516274394366725,
"mc2_stderr": 0.01504190113817455
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759987
},
"harness|gsm8k|5": {
"acc": 0.22744503411675512,
"acc_stderr": 0.011546363312548092
}
}
```
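
As an illustration of working with these metrics, the sketch below downloads the raw results file linked above and computes an unweighted mean accuracy over the hendrycksTest (MMLU) subtasks. It assumes the per-task metrics sit under a top-level "results" key in that file, and the leaderboard's own published averages may be computed differently, so treat it as a reading aid rather than a reimplementation of the official aggregation:

```python
import json
from statistics import mean

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced above from this dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview",
    filename="results_2023-12-09T20-01-18.948310.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# ASSUMPTION: per-task metrics are nested under a top-level "results" key;
# fall back to the document itself if they are not.
results = data.get("results", data)

# Unweighted mean accuracy over the MMLU (hendrycksTest) subtasks.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {mean(mmlu_accs):.4f}")
```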
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview | [
"region:us"
] | 2023-12-09T20:04:10+00:00 | {"pretty_name": "Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview", "dataset_summary": "Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.5-preview](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T20:01:18.948310](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview/blob/main/results_2023-12-09T20-01-18.948310.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5441057040654342,\n \"acc_stderr\": 0.03404499199717172,\n \"acc_norm\": 0.5501066597591592,\n \"acc_norm_stderr\": 0.034782781683925894,\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5516274394366725,\n \"mc2_stderr\": 0.01504190113817455\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4931740614334471,\n \"acc_stderr\": 0.014610029151379813,\n \"acc_norm\": 0.5273037542662116,\n \"acc_norm_stderr\": 0.014589589101985994\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5624377614021111,\n \"acc_stderr\": 0.004950723480149757,\n \"acc_norm\": 0.7650866361282613,\n \"acc_norm_stderr\": 0.004230782375004432\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296562,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296562\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n \"acc_norm_stderr\": 0.04112490974670787\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.0381189098894041,\n \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.0381189098894041\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.6290322580645161,\n \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7171717171717171,\n \"acc_stderr\": 0.032087795587867514,\n \"acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.032087795587867514\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.032922966391551414,\n \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 
0.032922966391551414\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412195,\n \"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412195\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094527,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094527\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.032449808499900284,\n \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.032449808499900284\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7064220183486238,\n \"acc_stderr\": 0.019525151122639667,\n \"acc_norm\": 0.7064220183486238,\n \"acc_norm_stderr\": 0.019525151122639667\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6835443037974683,\n \"acc_stderr\": 0.030274974880218977,\n \"acc_norm\": 0.6835443037974683,\n \"acc_norm_stderr\": 0.030274974880218977\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077785,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077785\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7420178799489144,\n \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.7420178799489144,\n \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.026296227915613674,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.026296227915613674\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n \"acc_stderr\": 0.015461169002371544,\n \"acc_norm\": 0.3094972067039106,\n \"acc_norm_stderr\": 0.015461169002371544\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631438,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631438\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n \"acc_stderr\": 0.027513925683549434,\n \"acc_norm\": 0.6237942122186495,\n \"acc_norm_stderr\": 0.027513925683549434\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.02720111766692565,\n \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.02720111766692565\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.028999080904806185,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.028999080904806185\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.394393741851369,\n \"acc_stderr\": 0.012482141665631184,\n \"acc_norm\": 0.394393741851369,\n \"acc_norm_stderr\": 0.012482141665631184\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.020212274976302954,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.020212274976302954\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.04582004841505415,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.04582004841505415\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.038581589406855174,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.038581589406855174\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5516274394366725,\n \"mc2_stderr\": 0.01504190113817455\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.22744503411675512,\n \"acc_stderr\": 0.011546363312548092\n }\n}\n```", "repo_url": "https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-18.948310.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-18.948310.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-18.948310.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-18.948310.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-18.948310.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T20_01_18.948310", "path": ["**/details_harness|winogrande|5_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T20-01-18.948310.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T20_01_18.948310", "path": ["results_2023-12-09T20-01-18.948310.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T20-01-18.948310.parquet"]}]}]} | 2023-12-09T20:04:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model WebraftAI/synapsellm-7b-mistral-v0.5-preview on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
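A minimal sketch of the load call (reconstructed from the card template; the repository id below follows the leaderboard's naming convention and is an assumption here, since the processed text stripped the original snippet):

```python
from datasets import load_dataset

# Hypothetical repository id, following the open-llm-leaderboard naming convention
data = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview",
	"harness_winogrande_5",
	split="train")
```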
## Latest results
These are the latest results from run 2023-12-09T20:01:18.948310 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WebraftAI/... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluatio... | [
6,
29,
31,
178,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of ... |
68d1e81c8b91161637c52f0bbc03896de150707d |
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.5-preview2](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview2",
"harness_winogrande_5",
split="train")
```
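The same call works for any of the 63 configurations. As a minimal sketch (using the config and split names listed in this card's metadata), you can also load the aggregated metrics or pin a specific timestamped run instead of "latest":

```python
from datasets import load_dataset

# Aggregated metrics for the run (the "results" configuration)
results = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview2",
	"results",
	split="latest")

# A single eval, pinned to its timestamped split rather than "latest"
arc = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview2",
	"harness_arc_challenge_25",
	split="2023_12_09T20_04_44.969487")
```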
## Latest results
These are the [latest results from run 2023-12-09T20:04:44.969487](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview2/blob/main/results_2023-12-09T20-04-44.969487.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5160127414618038,
"acc_stderr": 0.034322292179271935,
"acc_norm": 0.5205499288114833,
"acc_norm_stderr": 0.03505429574375341,
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405324,
"mc2": 0.5547100211730163,
"mc2_stderr": 0.014944795662781923
},
"harness|arc:challenge|25": {
"acc": 0.48293515358361777,
"acc_stderr": 0.014602878388536593,
"acc_norm": 0.5221843003412969,
"acc_norm_stderr": 0.014597001927076138
},
"harness|hellaswag|10": {
"acc": 0.5521808404700259,
"acc_stderr": 0.004962534264751918,
"acc_norm": 0.7554272057359092,
"acc_norm_stderr": 0.004289551633772026
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376907,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5967741935483871,
"acc_stderr": 0.027906150826041136,
"acc_norm": 0.5967741935483871,
"acc_norm_stderr": 0.027906150826041136
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03815494308688929,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03815494308688929
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47692307692307695,
"acc_stderr": 0.025323990861736118,
"acc_norm": 0.47692307692307695,
"acc_norm_stderr": 0.025323990861736118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.032449808499900284,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.032449808499900284
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6972477064220184,
"acc_stderr": 0.01969871143475634,
"acc_norm": 0.6972477064220184,
"acc_norm_stderr": 0.01969871143475634
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.032962451101722294,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.032962451101722294
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.03058732629470237,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.03058732629470237
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935573,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935573
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6909323116219668,
"acc_stderr": 0.0165249889197022,
"acc_norm": 0.6909323116219668,
"acc_norm_stderr": 0.0165249889197022
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5635838150289018,
"acc_stderr": 0.026700545424943677,
"acc_norm": 0.5635838150289018,
"acc_norm_stderr": 0.026700545424943677
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761959,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761959
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759567,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759567
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.027586006221607715,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.027586006221607715
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.02860208586275942,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.02860208586275942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34485006518904826,
"acc_stderr": 0.012139881006287052,
"acc_norm": 0.34485006518904826,
"acc_norm_stderr": 0.012139881006287052
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.03027332507734575,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.03027332507734575
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5081699346405228,
"acc_stderr": 0.02022513434305726,
"acc_norm": 0.5081699346405228,
"acc_norm_stderr": 0.02022513434305726
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5959183673469388,
"acc_stderr": 0.03141470802586589,
"acc_norm": 0.5959183673469388,
"acc_norm_stderr": 0.03141470802586589
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405324,
"mc2": 0.5547100211730163,
"mc2_stderr": 0.014944795662781923
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268738
},
"harness|gsm8k|5": {
"acc": 0.2759666413949962,
"acc_stderr": 0.012312603010427355
}
}
```
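As a small illustration (not part of the original card), once the JSON above has been parsed into a Python dict named `results`, the per-subject MMLU scores can be pulled out like this:

```python
# Assumes `results` holds the dict printed above
mmlu = {name: task["acc"]
        for name, task in results.items()
        if name.startswith("harness|hendrycksTest")}
print(f"mean MMLU acc over {len(mmlu)} sub-tasks: {sum(mmlu.values()) / len(mmlu):.4f}")
```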
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview2 | [
"region:us"
] | 2023-12-09T20:07:36+00:00 | {"pretty_name": "Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview2", "dataset_summary": "Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.5-preview2](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T20:04:44.969487](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview2/blob/main/results_2023-12-09T20-04-44.969487.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5160127414618038,\n \"acc_stderr\": 0.034322292179271935,\n \"acc_norm\": 0.5205499288114833,\n \"acc_norm_stderr\": 0.03505429574375341,\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405324,\n \"mc2\": 0.5547100211730163,\n \"mc2_stderr\": 0.014944795662781923\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.48293515358361777,\n \"acc_stderr\": 0.014602878388536593,\n \"acc_norm\": 0.5221843003412969,\n \"acc_norm_stderr\": 0.014597001927076138\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5521808404700259,\n \"acc_stderr\": 0.004962534264751918,\n \"acc_norm\": 0.7554272057359092,\n \"acc_norm_stderr\": 0.004289551633772026\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n \"acc_norm_stderr\": 0.04132125019723369\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.328042328042328,\n \"acc_stderr\": 0.024180497164376907,\n \"acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5967741935483871,\n \"acc_stderr\": 0.027906150826041136,\n \"acc_norm\": 0.5967741935483871,\n \"acc_norm_stderr\": 0.027906150826041136\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.03815494308688929,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03815494308688929\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 
0.033248379397581594\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736118,\n \"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736118\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.032449808499900284,\n \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.032449808499900284\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6972477064220184,\n \"acc_stderr\": 0.01969871143475634,\n \"acc_norm\": 0.6972477064220184,\n \"acc_norm_stderr\": 0.01969871143475634\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.032962451101722294,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.032962451101722294\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6708860759493671,\n \"acc_stderr\": 0.03058732629470237,\n \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.03058732629470237\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935573,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935573\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6909323116219668,\n \"acc_stderr\": 0.0165249889197022,\n \"acc_norm\": 0.6909323116219668,\n \"acc_norm_stderr\": 0.0165249889197022\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.026700545424943677,\n \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.026700545424943677\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.014736926383761959,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.014736926383761959\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02818059632825929,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02818059632825929\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n \"acc_stderr\": 0.027982680459759567,\n \"acc_norm\": 0.5852090032154341,\n \"acc_norm_stderr\": 0.027982680459759567\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607715,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607715\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275942,\n \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275942\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34485006518904826,\n \"acc_stderr\": 0.012139881006287052,\n \"acc_norm\": 0.34485006518904826,\n \"acc_norm_stderr\": 0.012139881006287052\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734575,\n \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734575\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5081699346405228,\n \"acc_stderr\": 0.02022513434305726,\n \"acc_norm\": 0.5081699346405228,\n \"acc_norm_stderr\": 0.02022513434305726\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.03141470802586589,\n \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.03141470802586589\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405324,\n \"mc2\": 0.5547100211730163,\n \"mc2_stderr\": 0.014944795662781923\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268738\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.2759666413949962,\n \"acc_stderr\": 0.012312603010427355\n }\n}\n```", "repo_url": "https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.5-preview2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-04-44.969487.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-04-44.969487.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-04-44.969487.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-04-44.969487.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-04-44.969487.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T20_04_44.969487", "path": ["**/details_harness|winogrande|5_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T20-04-44.969487.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T20_04_44.969487", "path": ["results_2023-12-09T20-04-44.969487.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T20-04-44.969487.parquet"]}]}]} | 2023-12-09T20:08:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model WebraftAI/synapsellm-7b-mistral-v0.5-preview2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
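The original snippet was stripped during processing; a minimal sketch, assuming the repository follows the `open-llm-leaderboard/details_{org}__{model}` naming convention used by these cards:
```python
from datasets import load_dataset

# Repository name is assumed from the details_{org}__{model} convention;
# "harness_winogrande_5" is one of the 63 task configurations.
data = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.5-preview2",
	"harness_winogrande_5",
	split="train")
```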
## Latest results
These are the latest results from run 2023-12-09T20:04:44.969487 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WebraftAI... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluati... | [
6,
30,
31,
179,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.5-preview2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of... |
34cdc8108b5c75bb7444ad559e22821ea05454cf |
# Dataset Card for Evaluation run of uukuguy/speechless-code-mistral-7b-v2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-code-mistral-7b-v2.0](https://huggingface.co/uukuguy/speechless-code-mistral-7b-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v2.0",
"harness_winogrande_5",
split="train")
```
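Since this dataset was created from 2 runs, each run is also exposed as its own timestamped split; a minimal sketch of loading the earlier run (split name taken from the configs in this card's metadata):

```python
from datasets import load_dataset

# Load the details of one specific run via its timestamped split name;
# the "latest" split always mirrors the most recent run.
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v2.0",
	"harness_winogrande_5",
	split="2023_12_09T20_08_24.695971")
```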
## Latest results
These are the [latest results from run 2024-01-04T13:58:45.141578](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v2.0/blob/main/results_2024-01-04T13-58-45.141578.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5140785549022919,
"acc_stderr": 0.034312631751023934,
"acc_norm": 0.517073732009842,
"acc_norm_stderr": 0.03502546155916221,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5205221363822641,
"mc2_stderr": 0.01546112612953185
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.014609263165632182,
"acc_norm": 0.523037542662116,
"acc_norm_stderr": 0.01459587320535827
},
"harness|hellaswag|10": {
"acc": 0.569308902609042,
"acc_stderr": 0.004941609820763585,
"acc_norm": 0.7561242780322645,
"acc_norm_stderr": 0.004285410130466108
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874141,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874141
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5509433962264151,
"acc_stderr": 0.030612730713641095,
"acc_norm": 0.5509433962264151,
"acc_norm_stderr": 0.030612730713641095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.03812400565974833,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.03812400565974833
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425075,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425075
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.042639068927951336,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.042639068927951336
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5838709677419355,
"acc_stderr": 0.028040981380761536,
"acc_norm": 0.5838709677419355,
"acc_norm_stderr": 0.028040981380761536
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806586,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806586
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244442,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244442
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.0329229663915514,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.0329229663915514
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45384615384615384,
"acc_stderr": 0.025242770987126174,
"acc_norm": 0.45384615384615384,
"acc_norm_stderr": 0.025242770987126174
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6605504587155964,
"acc_stderr": 0.02030210934266235,
"acc_norm": 0.6605504587155964,
"acc_norm_stderr": 0.02030210934266235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647206,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647206
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488418,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488418
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.03058732629470236,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.03058732629470236
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212093,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212093
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6134969325153374,
"acc_stderr": 0.03825825548848607,
"acc_norm": 0.6134969325153374,
"acc_norm_stderr": 0.03825825548848607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.027046857630716677,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.027046857630716677
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.665389527458493,
"acc_stderr": 0.016873468641592157,
"acc_norm": 0.665389527458493,
"acc_norm_stderr": 0.016873468641592157
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.026720034380514998,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.026720034380514998
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2770949720670391,
"acc_stderr": 0.014968772435812143,
"acc_norm": 0.2770949720670391,
"acc_norm_stderr": 0.014968772435812143
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02861462475280544,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02861462475280544
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.02798268045975956,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.02798268045975956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.027667138569422704,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.027667138569422704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32978723404255317,
"acc_stderr": 0.0280459469420424,
"acc_norm": 0.32978723404255317,
"acc_norm_stderr": 0.0280459469420424
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3898305084745763,
"acc_stderr": 0.012456386619082604,
"acc_norm": 0.3898305084745763,
"acc_norm_stderr": 0.012456386619082604
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.029674288281311183,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.029674288281311183
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5016339869281046,
"acc_stderr": 0.020227726838150117,
"acc_norm": 0.5016339869281046,
"acc_norm_stderr": 0.020227726838150117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5205221363822641,
"mc2_stderr": 0.01546112612953185
},
"harness|winogrande|5": {
"acc": 0.7134964483030781,
"acc_stderr": 0.01270703013996038
},
"harness|gsm8k|5": {
"acc": 0.35633055344958303,
"acc_stderr": 0.01319168503135746
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v2.0 | [
"region:us"
] | 2023-12-09T20:11:15+00:00 | {"pretty_name": "Evaluation run of uukuguy/speechless-code-mistral-7b-v2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-code-mistral-7b-v2.0](https://huggingface.co/uukuguy/speechless-code-mistral-7b-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-04T13:58:45.141578](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v2.0/blob/main/results_2024-01-04T13-58-45.141578.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5140785549022919,\n \"acc_stderr\": 0.034312631751023934,\n \"acc_norm\": 0.517073732009842,\n \"acc_norm_stderr\": 0.03502546155916221,\n \"mc1\": 0.35128518971848227,\n \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5205221363822641,\n \"mc2_stderr\": 0.01546112612953185\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.014609263165632182,\n \"acc_norm\": 0.523037542662116,\n \"acc_norm_stderr\": 0.01459587320535827\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.569308902609042,\n \"acc_stderr\": 0.004941609820763585,\n \"acc_norm\": 0.7561242780322645,\n \"acc_norm_stderr\": 0.004285410130466108\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874141,\n \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874141\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5509433962264151,\n \"acc_stderr\": 0.030612730713641095,\n \"acc_norm\": 0.5509433962264151,\n \"acc_norm_stderr\": 0.030612730713641095\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.03812400565974833,\n \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.03812400565974833\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425075,\n \"acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425075\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.042639068927951336,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.042639068927951336\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5838709677419355,\n \"acc_stderr\": 0.028040981380761536,\n \"acc_norm\": 0.5838709677419355,\n \"acc_norm_stderr\": 0.028040981380761536\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806586,\n \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806586\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244442,\n \"acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244442\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.0329229663915514,\n \"acc_norm\": 0.7046632124352331,\n 
\"acc_norm_stderr\": 0.0329229663915514\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126174,\n \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126174\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6605504587155964,\n \"acc_stderr\": 0.02030210934266235,\n \"acc_norm\": 0.6605504587155964,\n \"acc_norm_stderr\": 0.02030210934266235\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647206,\n \"acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647206\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488418,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488418\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6708860759493671,\n \"acc_stderr\": 0.03058732629470236,\n \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.03058732629470236\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212093,\n \"acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212093\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848607,\n \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848607\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n \"acc_stderr\": 0.027046857630716677,\n \"acc_norm\": 0.782051282051282,\n \"acc_norm_stderr\": 0.027046857630716677\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.665389527458493,\n \"acc_stderr\": 0.016873468641592157,\n \"acc_norm\": 0.665389527458493,\n \"acc_norm_stderr\": 0.016873468641592157\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.026720034380514998,\n \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.026720034380514998\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2770949720670391,\n \"acc_stderr\": 0.014968772435812143,\n \"acc_norm\": 0.2770949720670391,\n \"acc_norm_stderr\": 0.014968772435812143\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n \"acc_stderr\": 0.02798268045975956,\n \"acc_norm\": 0.5852090032154341,\n \"acc_norm_stderr\": 0.02798268045975956\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.027667138569422704,\n \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.027667138569422704\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.32978723404255317,\n \"acc_stderr\": 0.0280459469420424,\n \"acc_norm\": 0.32978723404255317,\n \"acc_norm_stderr\": 0.0280459469420424\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3898305084745763,\n \"acc_stderr\": 0.012456386619082604,\n \"acc_norm\": 0.3898305084745763,\n \"acc_norm_stderr\": 0.012456386619082604\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.029674288281311183,\n \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.029674288281311183\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5016339869281046,\n \"acc_stderr\": 0.020227726838150117,\n \"acc_norm\": 0.5016339869281046,\n \"acc_norm_stderr\": 0.020227726838150117\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488905,\n \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488905\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5205221363822641,\n \"mc2_stderr\": 0.01546112612953185\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7134964483030781,\n \"acc_stderr\": 0.01270703013996038\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.35633055344958303,\n \"acc_stderr\": 0.01319168503135746\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/speechless-code-mistral-7b-v2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|arc:challenge|25_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|gsm8k|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hellaswag|10_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-08-24.695971.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-08-24.695971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-58-45.141578.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-58-45.141578.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-58-45.141578.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-04T13-58-45.141578.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-58-45.141578.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": 
"2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-08-24.695971.parquet"]}, 
{"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["**/details_harness|winogrande|5_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": ["**/details_harness|winogrande|5_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-04T13-58-45.141578.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T20_08_24.695971", "path": ["results_2023-12-09T20-08-24.695971.parquet"]}, {"split": "2024_01_04T13_58_45.141578", "path": 
["results_2024-01-04T13-58-45.141578.parquet"]}, {"split": "latest", "path": ["results_2024-01-04T13-58-45.141578.parquet"]}]}]} | 2024-01-04T14:01:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of uukuguy/speechless-code-mistral-7b-v2.0
Dataset automatically created during the evaluation run of model uukuguy/speechless-code-mistral-7b-v2.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
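For instance (a minimal sketch; the original snippet was stripped from this card, and the repository id below is inferred from the leaderboard's usual `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# "harness_winogrande_5" selects the 5-shot Winogrande details; the "train"
# split points to the latest evaluation run.
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-code-mistral-7b-v2.0",
	"harness_winogrande_5",
	split="train")
```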
## Latest results
These are the latest results from run 2024-01-04T13:58:45.141578 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of uukuguy/speechless-code-mistral-7b-v2.0\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-code-mistral-7b-v2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated tas... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of uukuguy/speechless-code-mistral-7b-v2.0\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-code-mistral-7b-v2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to... | [
6,
197,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/speechless-code-mistral-7b-v2.0\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-code-mistral-7b-v2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding... |
b81e9d8be8a84fcb9d610f58ae622ff84c3c9870 |
# Dataset Card for Evaluation run of Locutusque/Orca-2-13B-no_robots
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Locutusque/Orca-2-13B-no_robots
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Locutusque/Orca-2-13B-no_robots](https://huggingface.co/Locutusque/Orca-2-13B-no_robots) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
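# Each configuration corresponds to one evaluated task; "harness_winogrande_5"
# selects the 5-shot Winogrande details, and the "train" split always points
# to the latest run.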
data = load_dataset("open-llm-leaderboard/details_Locutusque__Orca-2-13B-no_robots",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-09T20:22:23.375628](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Orca-2-13B-no_robots/blob/main/results_2023-12-09T20-22-23.375628.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6001251554964193,
"acc_stderr": 0.032923087084061796,
"acc_norm": 0.605904200867543,
"acc_norm_stderr": 0.03362523660489667,
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5117394616239719,
"mc2_stderr": 0.014749980218549294
},
"harness|arc:challenge|25": {
"acc": 0.5622866894197952,
"acc_stderr": 0.014497573881108288,
"acc_norm": 0.591296928327645,
"acc_norm_stderr": 0.014365750345427006
},
"harness|hellaswag|10": {
"acc": 0.6075482971519618,
"acc_stderr": 0.004872984492967997,
"acc_norm": 0.7956582354112727,
"acc_norm_stderr": 0.004023957334461984
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.02479606060269995,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.02479606060269995
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.02506909438729654,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.02506909438729654
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.03128217706368461,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.03128217706368461
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503217,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503217
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489274,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489274
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.01474312539482329,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.01474312539482329
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165545,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.311731843575419,
"acc_stderr": 0.015491756531894635,
"acc_norm": 0.311731843575419,
"acc_norm_stderr": 0.015491756531894635
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547235,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.019659922493623354,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.019659922493623354
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5117394616239719,
"mc2_stderr": 0.014749980218549294
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569576
},
"harness|gsm8k|5": {
"acc": 0.27293404094010615,
"acc_stderr": 0.012270381151108754
}
}
```
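The aggregated numbers above are also stored in the "results" configuration (a minimal sketch, assuming that configuration follows the same split layout as the per-task configs, with a "latest" split resolving to the newest run):

```python
from datasets import load_dataset

# Load the aggregated metrics for this model; the "latest" split should
# resolve to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Locutusque__Orca-2-13B-no_robots",
	"results",
	split="latest")
```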
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Locutusque__Orca-2-13B-no_robots | [
"region:us"
] | 2023-12-09T20:25:19+00:00 | {"pretty_name": "Evaluation run of Locutusque/Orca-2-13B-no_robots", "dataset_summary": "Dataset automatically created during the evaluation run of model [Locutusque/Orca-2-13B-no_robots](https://huggingface.co/Locutusque/Orca-2-13B-no_robots) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Orca-2-13B-no_robots\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T20:22:23.375628](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Orca-2-13B-no_robots/blob/main/results_2023-12-09T20-22-23.375628.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6001251554964193,\n \"acc_stderr\": 0.032923087084061796,\n \"acc_norm\": 0.605904200867543,\n \"acc_norm_stderr\": 0.03362523660489667,\n \"mc1\": 0.35495716034271724,\n \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5117394616239719,\n \"mc2_stderr\": 0.014749980218549294\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5622866894197952,\n \"acc_stderr\": 0.014497573881108288,\n \"acc_norm\": 0.591296928327645,\n \"acc_norm_stderr\": 0.014365750345427006\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6075482971519618,\n \"acc_stderr\": 0.004872984492967997,\n \"acc_norm\": 0.7956582354112727,\n \"acc_norm_stderr\": 0.004023957334461984\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n 
\"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.02479606060269995,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.02479606060269995\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.02506909438729654,\n \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.02506909438729654\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503217,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503217\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489274,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489274\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7828863346104725,\n \"acc_stderr\": 0.01474312539482329,\n \"acc_norm\": 0.7828863346104725,\n \"acc_norm_stderr\": 0.01474312539482329\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165545,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165545\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.311731843575419,\n \"acc_stderr\": 0.015491756531894635,\n \"acc_norm\": 0.311731843575419,\n \"acc_norm_stderr\": 0.015491756531894635\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615693,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615693\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.02508947852376513,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.02508947852376513\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n \"acc_stderr\": 0.012647695889547235,\n \"acc_norm\": 0.43089960886571055,\n \"acc_norm_stderr\": 0.012647695889547235\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.019659922493623354,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.019659922493623354\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5117394616239719,\n \"mc2_stderr\": 0.014749980218549294\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569576\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27293404094010615,\n \"acc_stderr\": 0.012270381151108754\n }\n}\n```", "repo_url": 
"https://huggingface.co/Locutusque/Orca-2-13B-no_robots", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-22-23.375628.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-22-23.375628.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-22-23.375628.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-22-23.375628.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-22-23.375628.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-22-23.375628.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["**/details_harness|winogrande|5_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T20-22-23.375628.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T20_22_23.375628", "path": ["results_2023-12-09T20-22-23.375628.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T20-22-23.375628.parquet"]}]}]} | 2023-12-09T20:26:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Locutusque/Orca-2-13B-no_robots
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Locutusque/Orca-2-13B-no_robots on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
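```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Orca-2-13B-no_robots",
	"harness_winogrande_5",
	split="train")
```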
## Latest results
These are the latest results from run 2023-12-09T20:22:23.375628 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Locutusque/Orca-2-13B-no_robots",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Locutusque/Orca-2-13B-n... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Locutusque/Orca-2-13B-no_robots",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mode... | [
6,
23,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Locutusque/Orca-2-13B-no_robots## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Locutus... |
43c9a98b357e57b323fba8f482033331749b6acd |
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.4-preview3](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); an example of loading it follows the snippet below.
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview3",
"harness_winogrande_5",
split="train")
```
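
The aggregated "results" configuration mentioned above can be loaded the same way. A minimal sketch, assuming the "results" config and the "latest" split follow the same naming convention as the per-task configs in this card's metadata:

```python
from datasets import load_dataset

# Load the aggregated metrics for this run; "latest" always points to the
# most recent evaluation, while the timestamped split pins a specific run.
results = load_dataset(
    "open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview3",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics
```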
## Latest results
These are the [latest results from run 2023-12-09T20:24:42.121892](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview3/blob/main/results_2023-12-09T20-24-42.121892.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5277560649597496,
"acc_stderr": 0.03425240844552933,
"acc_norm": 0.532688176887894,
"acc_norm_stderr": 0.034990875171934714,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.5235126569364149,
"mc2_stderr": 0.015157264857162787
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.014609263165632182,
"acc_norm": 0.5127986348122867,
"acc_norm_stderr": 0.014606603181012541
},
"harness|hellaswag|10": {
"acc": 0.5513841864170484,
"acc_stderr": 0.00496336208527556,
"acc_norm": 0.7482573192591118,
"acc_norm_stderr": 0.004331271717773856
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.03422398565657551,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.03422398565657551
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.037694303145125674,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.037694303145125674
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178815,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.03257714077709662,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.03257714077709662
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547307,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547307
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7192660550458716,
"acc_stderr": 0.019266055045871616,
"acc_norm": 0.7192660550458716,
"acc_norm_stderr": 0.019266055045871616
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488418,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488418
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03068582059661079,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03068582059661079
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906276,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906276
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.588957055214724,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.588957055214724,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7254150702426565,
"acc_stderr": 0.015959829933084032,
"acc_norm": 0.7254150702426565,
"acc_norm_stderr": 0.015959829933084032
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.569364161849711,
"acc_stderr": 0.026658800273672376,
"acc_norm": 0.569364161849711,
"acc_norm_stderr": 0.026658800273672376
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3229050279329609,
"acc_stderr": 0.01563844038024149,
"acc_norm": 0.3229050279329609,
"acc_norm_stderr": 0.01563844038024149
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063145,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063145
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.027809322585774503,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.027809322585774503
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5154320987654321,
"acc_stderr": 0.02780749004427619,
"acc_norm": 0.5154320987654321,
"acc_norm_stderr": 0.02780749004427619
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611324,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3650586701434159,
"acc_stderr": 0.012296373743443478,
"acc_norm": 0.3650586701434159,
"acc_norm_stderr": 0.012296373743443478
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.03010563657001663,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.03010563657001663
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.020196594933541194,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.020196594933541194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827423,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827423
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.032658195885126966,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.032658195885126966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.5235126569364149,
"mc2_stderr": 0.015157264857162787
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
},
"harness|gsm8k|5": {
"acc": 0.24791508718726307,
"acc_stderr": 0.011893980214826171
}
}
```
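
If you prefer working with the raw results file rather than the `datasets` API, it can be fetched directly from the dataset repository. A minimal sketch using `huggingface_hub.hf_hub_download`, assuming the JSON filename from the link above; note that the raw file may nest the per-task metrics under a `results` key, so the access below is defensive:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON from the dataset repository (filename taken
# from the "latest results" link above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview3",
    filename="results_2023-12-09T20-24-42.121892.json",
    repo_type="dataset",
)

with open(path) as f:
    payload = json.load(f)

# The raw file may wrap the per-task metrics under a "results" key;
# fall back to the top level if it does not.
metrics = payload.get("results", payload)
print(metrics["all"]["acc_norm"])  # aggregated normalized accuracy
```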
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview3 | [
"region:us"
] | 2023-12-09T20:27:33+00:00 | {"pretty_name": "Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview3", "dataset_summary": "Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.4-preview3](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T20:24:42.121892](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview3/blob/main/results_2023-12-09T20-24-42.121892.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5277560649597496,\n \"acc_stderr\": 0.03425240844552933,\n \"acc_norm\": 0.532688176887894,\n \"acc_norm_stderr\": 0.034990875171934714,\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.5235126569364149,\n \"mc2_stderr\": 0.015157264857162787\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.014609263165632182,\n \"acc_norm\": 0.5127986348122867,\n \"acc_norm_stderr\": 0.014606603181012541\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5513841864170484,\n \"acc_stderr\": 0.00496336208527556,\n \"acc_norm\": 0.7482573192591118,\n \"acc_norm_stderr\": 0.004331271717773856\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n \"acc_stderr\": 0.027045746573534327,\n \"acc_norm\": 0.6548387096774193,\n \"acc_norm_stderr\": 0.027045746573534327\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.03422398565657551,\n \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.03422398565657551\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.037694303145125674,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.037694303145125674\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178815,\n \"acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.03257714077709662,\n \"acc_norm\": 0.7150259067357513,\n 
\"acc_norm_stderr\": 0.03257714077709662\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547307,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547307\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7192660550458716,\n \"acc_stderr\": 0.019266055045871616,\n \"acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.019266055045871616\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488418,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488418\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03068582059661079,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03068582059661079\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04643454608906276,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04643454608906276\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.588957055214724,\n \"acc_stderr\": 0.038656978537853624,\n \"acc_norm\": 0.588957055214724,\n \"acc_norm_stderr\": 0.038656978537853624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7254150702426565,\n \"acc_stderr\": 0.015959829933084032,\n \"acc_norm\": 0.7254150702426565,\n \"acc_norm_stderr\": 0.015959829933084032\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.569364161849711,\n \"acc_stderr\": 0.026658800273672376,\n \"acc_norm\": 0.569364161849711,\n \"acc_norm_stderr\": 0.026658800273672376\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3229050279329609,\n \"acc_stderr\": 0.01563844038024149,\n \"acc_norm\": 0.3229050279329609,\n \"acc_norm_stderr\": 0.01563844038024149\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063145,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063145\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n \"acc_stderr\": 0.027809322585774503,\n \"acc_norm\": 0.6012861736334405,\n \"acc_norm_stderr\": 0.027809322585774503\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5154320987654321,\n \"acc_stderr\": 0.02780749004427619,\n \"acc_norm\": 0.5154320987654321,\n \"acc_norm_stderr\": 0.02780749004427619\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611324,\n \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611324\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3650586701434159,\n \"acc_stderr\": 0.012296373743443478,\n \"acc_norm\": 0.3650586701434159,\n \"acc_norm_stderr\": 0.012296373743443478\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.03010563657001663,\n \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.03010563657001663\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.020196594933541194,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.020196594933541194\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827423,\n \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827423\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n \"acc_stderr\": 0.032658195885126966,\n \"acc_norm\": 0.6915422885572139,\n \"acc_norm_stderr\": 0.032658195885126966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.5235126569364149,\n \"mc2_stderr\": 0.015157264857162787\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.24791508718726307,\n \"acc_stderr\": 0.011893980214826171\n }\n}\n```", "repo_url": "https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.4-preview3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-24-42.121892.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-24-42.121892.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-24-42.121892.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-24-42.121892.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-24-42.121892.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T20_24_42.121892", "path": ["**/details_harness|winogrande|5_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T20-24-42.121892.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_09T20_24_42.121892", "path": ["results_2023-12-09T20-24-42.121892.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T20-24-42.121892.parquet"]}]}]} | 2023-12-09T20:28:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model WebraftAI/synapsellm-7b-mistral-v0.4-preview3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
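The snippet below mirrors the loader example embedded in this card's metadata; `harness_winogrande_5` is just one of the per-task configs listed there:

```python
from datasets import load_dataset

# Load the per-example details of one evaluated task for this model.
data = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview3",
	"harness_winogrande_5",
	split="train")
```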
## Latest results
These are the latest results from run 2023-12-09T20:24:42.121892 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
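For the aggregated numbers themselves, a minimal sketch, assuming the `results` config and `latest` split declared in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics of the latest run live in the "results" config.
results = load_dataset(
    "open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.4-preview3",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated results of the latest run
```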
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WebraftAI... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluati... | [
6,
30,
31,
179,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.4-preview3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of... |
35001355c7883c5e9d6a1c12669c090621a06327 | # Dataset Card for "rapidapi-example-responses-tokenized-bart"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | davidfant/rapidapi-example-responses-tokenized-bart | [
"region:us"
] | 2023-12-09T20:28:57+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "premise", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 167674923.4914025, "num_examples": 45170}, {"name": "test", "num_bytes": 18630959.5085975, "num_examples": 5019}], "download_size": 65550667, "dataset_size": 186305883.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2023-12-10T00:01:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "rapidapi-example-responses-tokenized-bart"
More Information needed | [
"# Dataset Card for \"rapidapi-example-responses-tokenized-bart\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"rapidapi-example-responses-tokenized-bart\"\n\nMore Information needed"
] | [
6,
26
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"rapidapi-example-responses-tokenized-bart\"\n\nMore Information needed"
] |
3667e6867cd69095884caa6277e0833c052ea6d6 |
# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Intel/neural-chat-7b-v3-3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Intel/neural-chat-7b-v3-3](https://huggingface.co/Intel/neural-chat-7b-v3-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-09T20:33:34.862293](https://huggingface.co/datasets/open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3/blob/main/results_2023-12-09T20-33-34.862293.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.633718840288445,
"acc_stderr": 0.03262856399270551,
"acc_norm": 0.6351165946232198,
"acc_norm_stderr": 0.03329008839330021,
"mc1": 0.4700122399020808,
"mc1_stderr": 0.017471992091697534,
"mc2": 0.6301479198844473,
"mc2_stderr": 0.015176409746133967
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955007,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.013752062419817837
},
"harness|hellaswag|10": {
"acc": 0.6617207727544314,
"acc_stderr": 0.004721571443354415,
"acc_norm": 0.8526190001991635,
"acc_norm_stderr": 0.0035376085010691773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.0420392104015628,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.0420392104015628
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958546,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958546
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520193,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520193
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.02468597928623996,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.02468597928623996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091826,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091826
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.02423353229775873,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.02423353229775873
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.016384638410380823,
"acc_norm": 0.4,
"acc_norm_stderr": 0.016384638410380823
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729146,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43546284224250326,
"acc_stderr": 0.01266341210124834,
"acc_norm": 0.43546284224250326,
"acc_norm_stderr": 0.01266341210124834
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4700122399020808,
"mc1_stderr": 0.017471992091697534,
"mc2": 0.6301479198844473,
"mc2_stderr": 0.015176409746133967
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626913
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.013428382481274231
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3 | [
"region:us"
] | 2023-12-09T20:36:24+00:00 | {"pretty_name": "Evaluation run of Intel/neural-chat-7b-v3-3", "dataset_summary": "Dataset automatically created during the evaluation run of model [Intel/neural-chat-7b-v3-3](https://huggingface.co/Intel/neural-chat-7b-v3-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T20:33:34.862293](https://huggingface.co/datasets/open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3/blob/main/results_2023-12-09T20-33-34.862293.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.633718840288445,\n \"acc_stderr\": 0.03262856399270551,\n \"acc_norm\": 0.6351165946232198,\n \"acc_norm_stderr\": 0.03329008839330021,\n \"mc1\": 0.4700122399020808,\n \"mc1_stderr\": 0.017471992091697534,\n \"mc2\": 0.6301479198844473,\n \"mc2_stderr\": 0.015176409746133967\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955007,\n \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817837\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6617207727544314,\n \"acc_stderr\": 0.004721571443354415,\n \"acc_norm\": 0.8526190001991635,\n \"acc_norm_stderr\": 0.0035376085010691773\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.0420392104015628,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.0420392104015628\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958546,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958546\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.032500536843658404,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.032500536843658404\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520193,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520193\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.02468597928623996,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.02468597928623996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091826,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091826\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n 
\"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.016384638410380823,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.016384638410380823\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729146,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729146\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43546284224250326,\n \"acc_stderr\": 0.01266341210124834,\n \"acc_norm\": 0.43546284224250326,\n \"acc_norm_stderr\": 0.01266341210124834\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4700122399020808,\n \"mc1_stderr\": 0.017471992091697534,\n \"mc2\": 0.6301479198844473,\n \"mc2_stderr\": 0.015176409746133967\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626913\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \"acc_stderr\": 0.013428382481274231\n }\n}\n```", "repo_url": 
"https://huggingface.co/Intel/neural-chat-7b-v3-3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-33-34.862293.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-33-34.862293.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-33-34.862293.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-33-34.862293.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-33-34.862293.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-33-34.862293.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["**/details_harness|winogrande|5_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T20-33-34.862293.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T20_33_34.862293", "path": ["results_2023-12-09T20-33-34.862293.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T20-33-34.862293.parquet"]}]}]} | 2023-12-09T20:37:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Intel/neural-chat-7b-v3-3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
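For instance (a minimal sketch using the `datasets` library; the configuration name below is one of those listed in this card's metadata):

```python
from datasets import load_dataset

# Load the per-sample details of one evaluation task (here, 5-shot Winogrande).
data = load_dataset("open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3",
	"harness_winogrande_5",
	split="train")
```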
## Latest results
These are the latest results from run 2023-12-09T20:33:34.862293 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
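A minimal sketch for fetching that results file directly (this assumes the standard `huggingface_hub` download API; the filename is derived from the run timestamp above):

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results JSON for this run from the dataset repo.
path = hf_hub_download(
	repo_id="open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3",
	filename="results_2023-12-09T20-33-34.862293.json",
	repo_type="dataset",
)
results = json.load(open(path))
```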
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Intel/neural-chat-7b-v3-3 on ... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Inte... | [
6,
20,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Intel/neural-... |
fdf62b3bc22e4a052afe26888152dff144acfa42 |
# Dataset Card for Evaluation run of CausalLM/72B-preview
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CausalLM/72B-preview
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CausalLM/72B-preview](https://huggingface.co/CausalLM/72B-preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CausalLM__72B-preview",
"harness_winogrande_5",
split="train")
```
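To work with only the aggregated metrics, you can instead load the "results" configuration (a sketch assuming this card exposes the same "results" configuration and "latest" split as the other evaluation datasets in this collection):

```python
from datasets import load_dataset

# The "results" configuration aggregates all task metrics for the run.
results = load_dataset("open-llm-leaderboard/details_CausalLM__72B-preview",
	"results",
	split="latest")
```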
## Latest results
These are the [latest results from run 2023-12-09T21:42:26.382618](https://huggingface.co/datasets/open-llm-leaderboard/details_CausalLM__72B-preview/blob/main/results_2023-12-09T21-42-26.382618.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7667362936260237,
"acc_stderr": 0.027929321227362417,
"acc_norm": 0.7704368351697709,
"acc_norm_stderr": 0.028461947646281283,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5257567284522894,
"mc2_stderr": 0.014743557767765337
},
"harness|arc:challenge|25": {
"acc": 0.606655290102389,
"acc_stderr": 0.014275101465693024,
"acc_norm": 0.6518771331058021,
"acc_norm_stderr": 0.013921008595179347
},
"harness|hellaswag|10": {
"acc": 0.6468830910177256,
"acc_stderr": 0.004769618829196502,
"acc_norm": 0.8323043218482374,
"acc_norm_stderr": 0.0037283229688748914
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9144736842105263,
"acc_stderr": 0.02275867713088861,
"acc_norm": 0.9144736842105263,
"acc_norm_stderr": 0.02275867713088861
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8301886792452831,
"acc_stderr": 0.023108393799841326,
"acc_norm": 0.8301886792452831,
"acc_norm_stderr": 0.023108393799841326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8,
"acc_stderr": 0.026148818018424502,
"acc_norm": 0.8,
"acc_norm_stderr": 0.026148818018424502
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8,
"acc_stderr": 0.0333333333333333,
"acc_norm": 0.8,
"acc_norm_stderr": 0.0333333333333333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6798941798941799,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.6798941798941799,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8903225806451613,
"acc_stderr": 0.017776778700485173,
"acc_norm": 0.8903225806451613,
"acc_norm_stderr": 0.017776778700485173
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6600985221674877,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.6600985221674877,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.0270459488258654,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.0270459488258654
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.0163199507007674,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.0163199507007674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792194,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792194
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.019982347208637282,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.019982347208637282
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.5296296296296297,
"acc_stderr": 0.030431963547936584,
"acc_norm": 0.5296296296296297,
"acc_norm_stderr": 0.030431963547936584
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8319327731092437,
"acc_stderr": 0.024289102115692275,
"acc_norm": 0.8319327731092437,
"acc_norm_stderr": 0.024289102115692275
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.543046357615894,
"acc_stderr": 0.040673251742474416,
"acc_norm": 0.543046357615894,
"acc_norm_stderr": 0.040673251742474416
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9284403669724771,
"acc_stderr": 0.011051255247815481,
"acc_norm": 0.9284403669724771,
"acc_norm_stderr": 0.011051255247815481
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.01886951464665892,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.01886951464665892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758535,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758535
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407252,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407252
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.02871877688934232,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.02871877688934232
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.0309227883204458,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.0309227883204458
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6785714285714286,
"acc_stderr": 0.04432804055291518,
"acc_norm": 0.6785714285714286,
"acc_norm_stderr": 0.04432804055291518
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253878,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253878
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9195402298850575,
"acc_stderr": 0.009726831316141866,
"acc_norm": 0.9195402298850575,
"acc_norm_stderr": 0.009726831316141866
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.846820809248555,
"acc_stderr": 0.019390370108969934,
"acc_norm": 0.846820809248555,
"acc_norm_stderr": 0.019390370108969934
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5642458100558659,
"acc_stderr": 0.016583881958602397,
"acc_norm": 0.5642458100558659,
"acc_norm_stderr": 0.016583881958602397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8562091503267973,
"acc_stderr": 0.020091188936043714,
"acc_norm": 0.8562091503267973,
"acc_norm_stderr": 0.020091188936043714
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8456591639871383,
"acc_stderr": 0.02051905034208471,
"acc_norm": 0.8456591639871383,
"acc_norm_stderr": 0.02051905034208471
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281123,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6276595744680851,
"acc_stderr": 0.028838921471251455,
"acc_norm": 0.6276595744680851,
"acc_norm_stderr": 0.028838921471251455
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6258148631029987,
"acc_stderr": 0.012359335618172063,
"acc_norm": 0.6258148631029987,
"acc_norm_stderr": 0.012359335618172063
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8272058823529411,
"acc_stderr": 0.02296606758558181,
"acc_norm": 0.8272058823529411,
"acc_norm_stderr": 0.02296606758558181
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7959183673469388,
"acc_stderr": 0.0258012834750905,
"acc_norm": 0.7959183673469388,
"acc_norm_stderr": 0.0258012834750905
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824667,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824667
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.96,
"acc_stderr": 0.01969463855669321,
"acc_norm": 0.96,
"acc_norm_stderr": 0.01969463855669321
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136616,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136616
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5257567284522894,
"mc2_stderr": 0.014743557767765337
},
"harness|winogrande|5": {
"acc": 0.824782951854775,
"acc_stderr": 0.010684179227706167
},
"harness|gsm8k|5": {
"acc": 0.7210007581501138,
"acc_stderr": 0.012354115779970311
}
}
```
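
Since the aggregated results are plain JSON, they are easy to post-process. A minimal sketch, assuming the linked `results_*.json` file has been downloaded locally and its top level matches the dict printed above:
```python
import json

# Load the aggregated results file linked above (assumed downloaded locally)
# and rank the MMLU (hendrycksTest) subtasks by accuracy.
with open("results_2023-12-09T21-42-26.382618.json") as f:
    results = json.load(f)

mmlu = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc:.3f}")
```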
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_CausalLM__72B-preview | [
"region:us"
] | 2023-12-09T20:40:27+00:00 | {"pretty_name": "Evaluation run of CausalLM/72B-preview", "dataset_summary": "Dataset automatically created during the evaluation run of model [CausalLM/72B-preview](https://huggingface.co/CausalLM/72B-preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CausalLM__72B-preview\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T21:42:26.382618](https://huggingface.co/datasets/open-llm-leaderboard/details_CausalLM__72B-preview/blob/main/results_2023-12-09T21-42-26.382618.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7667362936260237,\n \"acc_stderr\": 0.027929321227362417,\n \"acc_norm\": 0.7704368351697709,\n \"acc_norm_stderr\": 0.028461947646281283,\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5257567284522894,\n \"mc2_stderr\": 0.014743557767765337\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.606655290102389,\n \"acc_stderr\": 0.014275101465693024,\n \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179347\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6468830910177256,\n \"acc_stderr\": 0.004769618829196502,\n \"acc_norm\": 0.8323043218482374,\n \"acc_norm_stderr\": 0.0037283229688748914\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9144736842105263,\n \"acc_stderr\": 0.02275867713088861,\n \"acc_norm\": 0.9144736842105263,\n \"acc_norm_stderr\": 0.02275867713088861\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8301886792452831,\n \"acc_stderr\": 0.023108393799841326,\n \"acc_norm\": 0.8301886792452831,\n \"acc_norm_stderr\": 0.023108393799841326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 
0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.026148818018424502,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.026148818018424502\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.5701754385964912,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0333333333333333,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6798941798941799,\n \"acc_stderr\": 0.024026846392873506,\n \"acc_norm\": 0.6798941798941799,\n \"acc_norm_stderr\": 0.024026846392873506\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8903225806451613,\n \"acc_stderr\": 0.017776778700485173,\n \"acc_norm\": 0.8903225806451613,\n \"acc_norm_stderr\": 0.017776778700485173\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n \"acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.0270459488258654,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.0270459488258654\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.0163199507007674,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.0163199507007674\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.019982347208637282,\n \"acc_norm\": 
0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637282\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.5296296296296297,\n \"acc_stderr\": 0.030431963547936584,\n \"acc_norm\": 0.5296296296296297,\n \"acc_norm_stderr\": 0.030431963547936584\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8319327731092437,\n \"acc_stderr\": 0.024289102115692275,\n \"acc_norm\": 0.8319327731092437,\n \"acc_norm_stderr\": 0.024289102115692275\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.543046357615894,\n \"acc_stderr\": 0.040673251742474416,\n \"acc_norm\": 0.543046357615894,\n \"acc_norm_stderr\": 0.040673251742474416\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9284403669724771,\n \"acc_stderr\": 0.011051255247815481,\n \"acc_norm\": 0.9284403669724771,\n \"acc_norm_stderr\": 0.011051255247815481\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.03191923445686186,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.03191923445686186\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.01886951464665892,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.01886951464665892\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758535,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758535\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.0309227883204458,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.0309227883204458\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6785714285714286,\n \"acc_stderr\": 0.04432804055291518,\n \"acc_norm\": 0.6785714285714286,\n \"acc_norm_stderr\": 0.04432804055291518\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253878,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253878\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9195402298850575,\n \"acc_stderr\": 0.009726831316141866,\n \"acc_norm\": 0.9195402298850575,\n \"acc_norm_stderr\": 0.009726831316141866\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.846820809248555,\n \"acc_stderr\": 0.019390370108969934,\n \"acc_norm\": 0.846820809248555,\n \"acc_norm_stderr\": 0.019390370108969934\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5642458100558659,\n \"acc_stderr\": 0.016583881958602397,\n \"acc_norm\": 0.5642458100558659,\n \"acc_norm_stderr\": 0.016583881958602397\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043714,\n \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043714\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8456591639871383,\n \"acc_stderr\": 0.02051905034208471,\n \"acc_norm\": 0.8456591639871383,\n \"acc_norm_stderr\": 0.02051905034208471\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281123,\n \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281123\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6276595744680851,\n \"acc_stderr\": 0.028838921471251455,\n \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.028838921471251455\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6258148631029987,\n \"acc_stderr\": 0.012359335618172063,\n \"acc_norm\": 0.6258148631029987,\n \"acc_norm_stderr\": 0.012359335618172063\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8272058823529411,\n \"acc_stderr\": 0.02296606758558181,\n \"acc_norm\": 0.8272058823529411,\n \"acc_norm_stderr\": 0.02296606758558181\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.04122066502878285,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.04122066502878285\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7959183673469388,\n \"acc_stderr\": 0.0258012834750905,\n \"acc_norm\": 0.7959183673469388,\n \"acc_norm_stderr\": 0.0258012834750905\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824667,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824667\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.96,\n \"acc_stderr\": 0.01969463855669321,\n \"acc_norm\": 0.96,\n \"acc_norm_stderr\": 0.01969463855669321\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136616,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136616\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5257567284522894,\n \"mc2_stderr\": 0.014743557767765337\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.010684179227706167\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7210007581501138,\n \"acc_stderr\": 0.012354115779970311\n }\n}\n```", "repo_url": "https://huggingface.co/CausalLM/72B-preview", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|arc:challenge|25_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|gsm8k|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hellaswag|10_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-37-44.242475.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-37-44.242475.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T21-42-26.382618.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T21-42-26.382618.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T21-42-26.382618.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T21-42-26.382618.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-37-44.242475.parquet"]}, 
{"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["**/details_harness|winogrande|5_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": ["**/details_harness|winogrande|5_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T21-42-26.382618.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T20_37_44.242475", "path": ["results_2023-12-09T20-37-44.242475.parquet"]}, {"split": "2023_12_09T21_42_26.382618", "path": 
["results_2023-12-09T21-42-26.382618.parquet"]}, {"split": "latest", "path": ["results_2023-12-09T21-42-26.382618.parquet"]}]}]} | 2023-12-09T21:45:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of CausalLM/72B-preview
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model CausalLM/72B-preview on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
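A minimal sketch of such a call, assuming the repository follows the `open-llm-leaderboard/details_<org>__<model>` naming used by the other runs in this collection:

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard's naming pattern (an assumption).
data = load_dataset("open-llm-leaderboard/details_CausalLM__72B-preview",
                    "harness_winogrande_5",
                    split="train")
```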
## Latest results
These are the latest results from run 2023-12-09T21:42:26.382618 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of CausalLM/72B-preview",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CausalLM/72B-preview on the Open L... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CausalLM/72B-preview",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CausalLM/... | [
6,
18,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CausalLM/72B-preview## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CausalLM/72B-previ... |
9fecc6de4c1592b470c542489fcc561280d2f462 |
# Dataset Card for Evaluation run of perlthoughts/Falkor-16b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/perlthoughts/Falkor-16b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [perlthoughts/Falkor-16b](https://huggingface.co/perlthoughts/Falkor-16b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Falkor-16b",
"harness_winogrande_5",
split="train")
```
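Each run lives as a timestamped split inside every configuration; a minimal sketch for listing what is available, assuming the standard `datasets` inspection helpers:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_perlthoughts__Falkor-16b"
# The 63 task configurations, then the splits (run timestamp + "latest") of one task.
configs = get_dataset_config_names(repo)
splits = get_dataset_split_names(repo, "harness_winogrande_5")
print(len(configs), splits)
```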
## Latest results
These are the [latest results from run 2023-12-09T20:44:01.806324](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Falkor-16b/blob/main/results_2023-12-09T20-44-01.806324.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6322464801756708,
"acc_stderr": 0.032618125802324496,
"acc_norm": 0.6394191887381151,
"acc_norm_stderr": 0.03329436215245147,
"mc1": 0.4847001223990208,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.627668658731456,
"mc2_stderr": 0.015393187257856768
},
"harness|arc:challenge|25": {
"acc": 0.6322525597269625,
"acc_stderr": 0.014090995618168482,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892973
},
"harness|hellaswag|10": {
"acc": 0.6330412268472416,
"acc_stderr": 0.0048099011512348355,
"acc_norm": 0.826229834694284,
"acc_norm_stderr": 0.00378137335887
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908353,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908353
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978082,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978082
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914742,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.02570264026060374,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.02570264026060374
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.01270231749055981,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.01270231749055981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4847001223990208,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.627668658731456,
"mc2_stderr": 0.015393187257856768
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643417
},
"harness|gsm8k|5": {
"acc": 0.28278999241849884,
"acc_stderr": 0.012405020417873619
}
}
```
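To work with the aggregated numbers above programmatically, a minimal sketch that loads the dedicated "results" configuration; per this repo's file listing, its "latest" split always points at the newest run:

```python
from datasets import load_dataset

# Aggregated metrics live in the "results" configuration.
results = load_dataset("open-llm-leaderboard/details_perlthoughts__Falkor-16b",
                       "results",
                       split="latest")
print(results[0])
```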
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_perlthoughts__Falkor-16b | [
"region:us"
] | 2023-12-09T20:46:54+00:00 | {"pretty_name": "Evaluation run of perlthoughts/Falkor-16b", "dataset_summary": "Dataset automatically created during the evaluation run of model [perlthoughts/Falkor-16b](https://huggingface.co/perlthoughts/Falkor-16b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Falkor-16b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T20:44:01.806324](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Falkor-16b/blob/main/results_2023-12-09T20-44-01.806324.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6322464801756708,\n \"acc_stderr\": 0.032618125802324496,\n \"acc_norm\": 0.6394191887381151,\n \"acc_norm_stderr\": 0.03329436215245147,\n \"mc1\": 0.4847001223990208,\n \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.627668658731456,\n \"mc2_stderr\": 0.015393187257856768\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.014090995618168482,\n \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892973\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6330412268472416,\n \"acc_stderr\": 0.0048099011512348355,\n \"acc_norm\": 0.826229834694284,\n \"acc_norm_stderr\": 0.00378137335887\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n 
\"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908353,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908353\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 
0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978082,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978082\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n 
\"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n \"acc_stderr\": 0.016435865260914742,\n \"acc_norm\": 0.40782122905027934,\n \"acc_norm_stderr\": 0.016435865260914742\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.02570264026060374,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.02570264026060374\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n \"acc_stderr\": 0.01270231749055981,\n \"acc_norm\": 0.4485006518904824,\n \"acc_norm_stderr\": 0.01270231749055981\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4847001223990208,\n \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.627668658731456,\n \"mc2_stderr\": 0.015393187257856768\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643417\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.28278999241849884,\n \"acc_stderr\": 0.012405020417873619\n }\n}\n```", "repo_url": "https://huggingface.co/perlthoughts/Falkor-16b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-44-01.806324.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-44-01.806324.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-44-01.806324.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T20-44-01.806324.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-44-01.806324.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-44-01.806324.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["**/details_harness|winogrande|5_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T20-44-01.806324.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T20_44_01.806324", "path": ["results_2023-12-09T20-44-01.806324.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T20-44-01.806324.parquet"]}]}]} | 2023-12-09T20:47:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of perlthoughts/Falkor-16b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model perlthoughts/Falkor-16b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
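For instance (a minimal sketch: the repository id below assumes the leaderboard's usual `details_<org>__<model>` naming for this model, and `harness_winogrande_5` is one of the configurations listed in this dataset's metadata):
```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> convention used by the leaderboard
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Falkor-16b",
	"harness_winogrande_5",
	split="train")
```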
## Latest results
These are the latest results from run 2023-12-09T20:44:01.806324 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of perlthoughts/Falkor-16b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Falkor-16b on the ... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of perlthoughts/Falkor-16b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlth... | [
6,
19,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of perlthoughts/Falkor-16b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Fa... |
4577fb5335874140cde1953076ad08b526f51dfb | 139 Code Conversations generated from LeetCode questions, including the official answers.
968 messages in total from USER and SYSTEM.
Includes:
- Generation of the solutions
- Conversions into other programming languages
- Adjustments to the Code
- Generating tests
Conversations generated with GPT4/GPT4-Turbo
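A quick way to inspect the data is through the `datasets` library (a minimal sketch; the available splits and columns are not documented on this card, so they are inspected rather than assumed):
```python
from datasets import load_dataset

# Loading without a split returns a DatasetDict, so the actual
# splits and columns can be checked before picking one.
conversations = load_dataset("SebastianBodza/LeetCode_Conversations")
print(conversations)
```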
| SebastianBodza/LeetCode_Conversations | [
"region:us"
] | 2023-12-09T21:03:33+00:00 | {} | 2023-12-09T21:09:56+00:00 | [] | [] | TAGS
#region-us
| 139 Code Conversations generated from LeetCode questions, including the official answers.
968 messages in total from USER and SYSTEM.
Includes:
- Generation of the solutions
- Conversions into other programming languages
- Adjustments to the Code
- Generating tests
Conversations generated with GPT4/GPT4-Turbo
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
d4ca8eaee10a2568c1afd74eef0755c104f930ad | # Dataset Card for "mtdg-eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | nayohan/mtdg-eval | [
"region:us"
] | 2023-12-09T21:32:47+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 60809, "num_examples": 100}], "download_size": 39291, "dataset_size": 60809}} | 2023-12-09T22:27:03+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "mtdg-eval"
More Information needed | [
"# Dataset Card for \"mtdg-eval\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"mtdg-eval\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"mtdg-eval\"\n\nMore Information needed"
] |
f0b94c91184646d6c171d7180711aedf65039690 | # Dataset Card for "msdg-eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | nayohan/msdg-eval | [
"region:us"
] | 2023-12-09T21:32:50+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 406016, "num_examples": 100}], "download_size": 218562, "dataset_size": 406016}} | 2023-12-09T22:27:08+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "msdg-eval"
More Information needed | [
"# Dataset Card for \"msdg-eval\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"msdg-eval\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"msdg-eval\"\n\nMore Information needed"
] |
2d7c3c2da4a465ee0d659e65a94f03ce4174f34f |
# Dataset Card for Evaluation run of AA051610/AZG
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AA051610/AZG
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AA051610/AZG](https://huggingface.co/AA051610/AZG) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__AZG",
"harness_winogrande_5",
split="train")
```
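The aggregated metrics can be pulled the same way through the "results" configuration (a sketch, assuming the `results`/`latest` config and split layout that the other evaluation-run datasets in this dump expose):
```python
from datasets import load_dataset

# "results" config with a "latest" split, holding the aggregated metrics described above
results = load_dataset("open-llm-leaderboard/details_AA051610__AZG",
	"results",
	split="latest")
```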
## Latest results
These are the [latest results from run 2023-12-09T22:11:18.691101](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__AZG/blob/main/results_2023-12-09T22-11-18.691101.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6997224830510826,
"acc_stderr": 0.03037702672483155,
"acc_norm": 0.7036495225534056,
"acc_norm_stderr": 0.030966589072480434,
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.538427969031974,
"mc2_stderr": 0.015499026242399048
},
"harness|arc:challenge|25": {
"acc": 0.5981228668941979,
"acc_stderr": 0.014327268614578274,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.014117971901142822
},
"harness|hellaswag|10": {
"acc": 0.619398526190002,
"acc_stderr": 0.0048454245247640405,
"acc_norm": 0.8201553475403306,
"acc_norm_stderr": 0.003832731017592104
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741706,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741706
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7103448275862069,
"acc_stderr": 0.03780019230438015,
"acc_norm": 0.7103448275862069,
"acc_norm_stderr": 0.03780019230438015
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5423280423280423,
"acc_stderr": 0.02565886886205833,
"acc_norm": 0.5423280423280423,
"acc_norm_stderr": 0.02565886886205833
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.020923327006423298,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.020923327006423298
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853113,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853113
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7538461538461538,
"acc_stderr": 0.021840866990423084,
"acc_norm": 0.7538461538461538,
"acc_norm_stderr": 0.021840866990423084
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7983193277310925,
"acc_stderr": 0.026064313406304527,
"acc_norm": 0.7983193277310925,
"acc_norm_stderr": 0.026064313406304527
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8862385321100917,
"acc_stderr": 0.013613614800232808,
"acc_norm": 0.8862385321100917,
"acc_norm_stderr": 0.013613614800232808
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8823529411764706,
"acc_stderr": 0.022613286601132012,
"acc_norm": 0.8823529411764706,
"acc_norm_stderr": 0.022613286601132012
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878463,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878463
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.757847533632287,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.757847533632287,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.032472243899179465,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.032472243899179465
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580664,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580664
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941632,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941632
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.879948914431673,
"acc_stderr": 0.011622736692041263,
"acc_norm": 0.879948914431673,
"acc_norm_stderr": 0.011622736692041263
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071124,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41675977653631285,
"acc_stderr": 0.016489134962438954,
"acc_norm": 0.41675977653631285,
"acc_norm_stderr": 0.016489134962438954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.023929155517351298,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.023929155517351298
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.023839303311398188,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.023839303311398188
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445796,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445796
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5286831812255541,
"acc_stderr": 0.012749206007657459,
"acc_norm": 0.5286831812255541,
"acc_norm_stderr": 0.012749206007657459
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545436,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545436
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.538427969031974,
"mc2_stderr": 0.015499026242399048
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.011251958281205069
},
"harness|gsm8k|5": {
"acc": 0.599696739954511,
"acc_stderr": 0.013495926436566441
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_AA051610__AZG | [
"region:us"
] | 2023-12-09T22:14:06+00:00 | {"pretty_name": "Evaluation run of AA051610/AZG", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051610/AZG](https://huggingface.co/AA051610/AZG) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__AZG\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-09T22:11:18.691101](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__AZG/blob/main/results_2023-12-09T22-11-18.691101.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6997224830510826,\n \"acc_stderr\": 0.03037702672483155,\n \"acc_norm\": 0.7036495225534056,\n \"acc_norm_stderr\": 0.030966589072480434,\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.538427969031974,\n \"mc2_stderr\": 0.015499026242399048\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5981228668941979,\n \"acc_stderr\": 0.014327268614578274,\n \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142822\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.619398526190002,\n \"acc_stderr\": 0.0048454245247640405,\n \"acc_norm\": 0.8201553475403306,\n \"acc_norm_stderr\": 0.003832731017592104\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.030643607071677088,\n \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.030643607071677088\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741706,\n \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741706\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.7986111111111112,\n \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 
0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.029771642712491227,\n \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.029771642712491227\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7103448275862069,\n \"acc_stderr\": 0.03780019230438015,\n \"acc_norm\": 0.7103448275862069,\n \"acc_norm_stderr\": 0.03780019230438015\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5423280423280423,\n \"acc_stderr\": 0.02565886886205833,\n \"acc_norm\": 0.5423280423280423,\n \"acc_norm_stderr\": 0.02565886886205833\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8387096774193549,\n \"acc_stderr\": 0.020923327006423298,\n \"acc_norm\": 0.8387096774193549,\n \"acc_norm_stderr\": 0.020923327006423298\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5665024630541872,\n \"acc_stderr\": 0.03486731727419872,\n \"acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.03486731727419872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853113,\n \"acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853113\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7538461538461538,\n \"acc_stderr\": 
0.021840866990423084,\n \"acc_norm\": 0.7538461538461538,\n \"acc_norm_stderr\": 0.021840866990423084\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7983193277310925,\n \"acc_stderr\": 0.026064313406304527,\n \"acc_norm\": 0.7983193277310925,\n \"acc_norm_stderr\": 0.026064313406304527\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8862385321100917,\n \"acc_stderr\": 0.013613614800232808,\n \"acc_norm\": 0.8862385321100917,\n \"acc_norm_stderr\": 0.013613614800232808\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8823529411764706,\n \"acc_stderr\": 0.022613286601132012,\n \"acc_norm\": 0.8823529411764706,\n \"acc_norm_stderr\": 0.022613286601132012\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878463,\n \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878463\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.757847533632287,\n \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.757847533632287,\n \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.032472243899179465,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.032472243899179465\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580664,\n \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580664\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n \"acc_stderr\": 0.018724301741941632,\n \"acc_norm\": 0.9102564102564102,\n \"acc_norm_stderr\": 0.018724301741941632\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.879948914431673,\n \"acc_stderr\": 0.011622736692041263,\n \"acc_norm\": 0.879948914431673,\n 
\"acc_norm_stderr\": 0.011622736692041263\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071124,\n \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071124\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.023929155517351298,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.023929155517351298\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n \"acc_stderr\": 0.023839303311398188,\n \"acc_norm\": 0.7717041800643086,\n \"acc_norm_stderr\": 0.023839303311398188\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445796,\n \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445796\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5286831812255541,\n \"acc_stderr\": 0.012749206007657459,\n \"acc_norm\": 0.5286831812255541,\n \"acc_norm_stderr\": 0.012749206007657459\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545436,\n \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545436\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.538427969031974,\n \"mc2_stderr\": 0.015499026242399048\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205069\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.599696739954511,\n \"acc_stderr\": 0.013495926436566441\n }\n}\n```", "repo_url": "https://huggingface.co/AA051610/AZG", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|arc:challenge|25_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|gsm8k|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hellaswag|10_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T22-11-18.691101.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T22-11-18.691101.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-09T22-11-18.691101.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-09T22-11-18.691101.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T22-11-18.691101.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T22-11-18.691101.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["**/details_harness|winogrande|5_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-09T22-11-18.691101.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_09T22_11_18.691101", "path": ["results_2023-12-09T22-11-18.691101.parquet"]}, {"split": "latest", "path": 
["results_2023-12-09T22-11-18.691101.parquet"]}]}]} | 2023-12-09T22:14:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051610/AZG
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model AA051610/AZG on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
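A minimal sketch (the repo id below is an assumption, inferred from the `details_<org>__<model>` naming pattern used by the other evaluation datasets in this collection):

```python
from datasets import load_dataset

# Assumed repo id, following the leaderboard's details_<org>__<model> pattern
data = load_dataset("open-llm-leaderboard/details_AA051610__AZG",
	"harness_winogrande_5",
	split="train")
```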
## Latest results
These are the latest results from run 2023-12-09T22:11:18.691101 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of AA051610/AZG",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AA051610/AZG on the Open LLM Leaderboard.\... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AA051610/AZG",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AA051610/AZG on t... | [
6,
16,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AA051610/AZG## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model AA051610/AZG on the Open L... |
40dcff541f1dc1cbab71bfedd248843ebd97ed6d | # Dataset Card for "idrid_grading"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | marcelittle/idrid_grading | [
"region:us"
] | 2023-12-09T23:02:03+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19595295.0, "num_examples": 257}], "download_size": 19434641, "dataset_size": 19595295.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-09T23:02:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "idrid_grading"
More Information needed | [
"# Dataset Card for \"idrid_grading\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"idrid_grading\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"idrid_grading\"\n\nMore Information needed"
] |
97b27f3acfb9fa683bdf0c3a344d665e43847e04 | # Dataset Card for "rapidapi-example-responses-tokenized-xlm-roberta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | davidfant/rapidapi-example-responses-tokenized-xlm-roberta | [
"region:us"
] | 2023-12-09T23:58:32+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "premise", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 166566222.06213877, "num_examples": 43755}, {"name": "test", "num_bytes": 18508626.93786124, "num_examples": 4862}], "download_size": 62641988, "dataset_size": 185074849.0}} | 2023-12-09T23:58:45+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "rapidapi-example-responses-tokenized-xlm-roberta"
More Information needed | [
"# Dataset Card for \"rapidapi-example-responses-tokenized-xlm-roberta\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"rapidapi-example-responses-tokenized-xlm-roberta\"\n\nMore Information needed"
] | [
6,
30
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"rapidapi-example-responses-tokenized-xlm-roberta\"\n\nMore Information needed"
] |
557bc2c1ff6e04709ce61ced1a448b861c861821 |
# Dataset Card for Evaluation run of NeverSleep/Noromaid-7b-v0.1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NeverSleep/Noromaid-7b-v0.1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NeverSleep/Noromaid-7b-v0.1.1](https://huggingface.co/NeverSleep/Noromaid-7b-v0.1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.1.1",
"harness_winogrande_5",
split="train")
```
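The aggregated results described above can be loaded the same way through the "results" configuration (a sketch; the config and split names are taken from the configs metadata further down this card):

```python
from datasets import load_dataset

# "results" config with its "latest" split, as listed in this card's configs metadata
agg = load_dataset("open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.1.1",
	"results",
	split="latest")
```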
## Latest results
These are the [latest results from run 2023-12-10T00:08:08.403687](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.1.1/blob/main/results_2023-12-10T00-08-08.403687.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6317982878988913,
"acc_stderr": 0.032611423846025014,
"acc_norm": 0.6377453054345599,
"acc_norm_stderr": 0.033270865682523715,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502353,
"mc2": 0.4429984716658762,
"mc2_stderr": 0.014505119561026104
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.014169664520303098
},
"harness|hellaswag|10": {
"acc": 0.6429994025094603,
"acc_stderr": 0.004781358113341955,
"acc_norm": 0.842760406293567,
"acc_norm_stderr": 0.003632825479128595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630643,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630643
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739155,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.0340763209385405,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.0340763209385405
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748927,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748927
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636863,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636863
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331152,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331152
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816653,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816653
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4406779661016949,
"acc_stderr": 0.012680037994097062,
"acc_norm": 0.4406779661016949,
"acc_norm_stderr": 0.012680037994097062
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031215,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031215
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675596,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090081,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090081
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502353,
"mc2": 0.4429984716658762,
"mc2_stderr": 0.014505119561026104
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.01166122363764341
},
"harness|gsm8k|5": {
"acc": 0.3684609552691433,
"acc_stderr": 0.013287342651674569
}
}
```
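As a small illustration of consuming this structure (a sketch; the two entries are copied from the JSON above, and the real dict holds all 57 "hendrycksTest" tasks), one could average the per-task MMLU accuracies like so:

```python
import json

# Two entries copied verbatim from the results above, for illustration only
results = json.loads("""{
  "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.6447368421052632}
}""")

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
print(f"Mean acc over {len(mmlu_accs)} MMLU tasks: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```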
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.1.1 | [
"region:us"
] | 2023-12-10T00:10:59+00:00 | {"pretty_name": "Evaluation run of NeverSleep/Noromaid-7b-v0.1.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeverSleep/Noromaid-7b-v0.1.1](https://huggingface.co/NeverSleep/Noromaid-7b-v0.1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.1.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T00:08:08.403687](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.1.1/blob/main/results_2023-12-10T00-08-08.403687.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6317982878988913,\n \"acc_stderr\": 0.032611423846025014,\n \"acc_norm\": 0.6377453054345599,\n \"acc_norm_stderr\": 0.033270865682523715,\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502353,\n \"mc2\": 0.4429984716658762,\n \"mc2_stderr\": 0.014505119561026104\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.014169664520303098\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6429994025094603,\n \"acc_stderr\": 0.004781358113341955,\n \"acc_norm\": 0.842760406293567,\n \"acc_norm_stderr\": 0.003632825479128595\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630643,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630643\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739155,\n \"acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.0340763209385405,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.0340763209385405\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.02363687331748927,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.02363687331748927\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n \"acc_stderr\": 0.014351702181636863,\n \"acc_norm\": 0.7982120051085568,\n 
\"acc_norm_stderr\": 0.014351702181636863\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331152,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331152\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n \"acc_stderr\": 0.025122637608816653,\n \"acc_norm\": 0.7331189710610932,\n \"acc_norm_stderr\": 0.025122637608816653\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n \"acc_stderr\": 0.012680037994097062,\n \"acc_norm\": 0.4406779661016949,\n \"acc_norm_stderr\": 0.012680037994097062\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031215,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031215\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675596,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n \"acc_stderr\": 0.02411267824090081,\n \"acc_norm\": 0.8656716417910447,\n \"acc_norm_stderr\": 0.02411267824090081\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502353,\n \"mc2\": 0.4429984716658762,\n \"mc2_stderr\": 0.014505119561026104\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3684609552691433,\n \"acc_stderr\": 0.013287342651674569\n }\n}\n```", "repo_url": "https://huggingface.co/NeverSleep/Noromaid-7b-v0.1.1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|arc:challenge|25_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|gsm8k|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hellaswag|10_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-08-08.403687.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-08-08.403687.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-08-08.403687.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T00-08-08.403687.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-08-08.403687.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-08-08.403687.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["**/details_harness|winogrande|5_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T00-08-08.403687.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T00_08_08.403687", "path": ["results_2023-12-10T00-08-08.403687.parquet"]}, {"split": "latest", "path": 
["results_2023-12-10T00-08-08.403687.parquet"]}]}]} | 2023-12-10T00:11:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NeverSleep/Noromaid-7b-v0.1.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model NeverSleep/Noromaid-7b-v0.1.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
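A minimal loading sketch, following the convention used by the other cards in this dump (the dataset id `open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.1.1` is an assumption derived from the model's repo name and the leaderboard's naming pattern):

```python
from datasets import load_dataset

# Dataset id assumed from the leaderboard naming convention for this model
data = load_dataset("open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.1.1",
                    "harness_winogrande_5",  # one of the 63 per-task configurations
                    split="train")           # "train" points at the latest run
```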
## Latest results
These are the latest results from run 2023-12-10T00:08:08.403687 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of NeverSleep/Noromaid-7b-v0.1.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NeverSleep/Noromaid-7b-v0... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NeverSleep/Noromaid-7b-v0.1.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ... | [
6,
23,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NeverSleep/Noromaid-7b-v0.1.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model NeverSlee... |
9fcf571ea07cde4e5dd34b8bdae97d3033994b93 |
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-67b-v15-base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-deepseek-67b-v15-base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-deepseek-67b-v15-base](https://huggingface.co/OpenBuddy/openbuddy-deepseek-67b-v15-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base",
"harness_winogrande_5",
split="train")
```
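Since the run is also stored under the aggregated "results" configuration, with a "latest" split pointing at the most recent run (both names appear in this card's configuration list), here is a minimal sketch for pulling the aggregated metrics directly:

```python
from datasets import load_dataset

# "results" and "latest" are the config/split names listed in this card's metadata
results = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base",
                       "results",
                       split="latest")
print(results[0])  # a single row holding the aggregated scores for the run
```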
## Latest results
These are the [latest results from run 2023-12-10T00:18:57.450795](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base/blob/main/results_2023-12-10T00-18-57.450795.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7077078977933982,
"acc_stderr": 0.030015760444243065,
"acc_norm": 0.7114942838020437,
"acc_norm_stderr": 0.030600759357365358,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.016874805001453178,
"mc2": 0.5230963516759597,
"mc2_stderr": 0.014845955802002899
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882419,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902276
},
"harness|hellaswag|10": {
"acc": 0.6751643098984266,
"acc_stderr": 0.004673563250946104,
"acc_norm": 0.8602867954590719,
"acc_norm_stderr": 0.003459806991389836
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775406,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.029514245964291762,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.029514245964291762
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.03855289616378949,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.03855289616378949
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.025699352832131792,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.025699352832131792
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632159,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632159
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424208,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424208
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.02306043838085774,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.02306043838085774
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.029869605095316908,
"acc_norm": 0.4,
"acc_norm_stderr": 0.029869605095316908
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.819327731092437,
"acc_stderr": 0.02499196496660076,
"acc_norm": 0.819327731092437,
"acc_norm_stderr": 0.02499196496660076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634612,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634612
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.012809780081878927,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.012809780081878927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.033509916046960436,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.033509916046960436
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316942,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316942
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744631,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744631
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580662,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580662
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.01789378490401853,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.01789378490401853
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8991060025542784,
"acc_stderr": 0.010770472014886722,
"acc_norm": 0.8991060025542784,
"acc_norm_stderr": 0.010770472014886722
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.02228963885261789,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.02228963885261789
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4145251396648045,
"acc_stderr": 0.016476342210253996,
"acc_norm": 0.4145251396648045,
"acc_norm_stderr": 0.016476342210253996
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7908496732026143,
"acc_stderr": 0.023287685312334806,
"acc_norm": 0.7908496732026143,
"acc_norm_stderr": 0.023287685312334806
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8549382716049383,
"acc_stderr": 0.019594877019727956,
"acc_norm": 0.8549382716049383,
"acc_norm_stderr": 0.019594877019727956
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5632333767926988,
"acc_stderr": 0.012667701919603657,
"acc_norm": 0.5632333767926988,
"acc_norm_stderr": 0.012667701919603657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.016774672365468504,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.016774672365468504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.016874805001453178,
"mc2": 0.5230963516759597,
"mc2_stderr": 0.014845955802002899
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222782
},
"harness|gsm8k|5": {
"acc": 0.5686125852918877,
"acc_stderr": 0.013642195352511575
}
}
```
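To turn a payload like the one above into a readable per-task summary, here is a minimal sketch (it assumes the JSON has been saved locally as `results.json`; the file name is illustrative):

```python
import json

# results.json is a hypothetical local copy of the payload printed above
with open("results.json") as f:
    results = json.load(f)

# Print each harness task with its accuracy, skipping the "all" aggregate
# (tasks like truthfulqa:mc report mc1/mc2 instead of acc and are skipped too)
for task, metrics in sorted(results.items()):
    if task == "all":
        continue
    acc = metrics.get("acc")
    if acc is not None:
        stderr = metrics.get("acc_stderr", float("nan"))
        print(f"{task}: acc={acc:.4f} ± {stderr:.4f}")
```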
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base | [
"region:us"
] | 2023-12-10T00:21:39+00:00 | {"pretty_name": "Evaluation run of OpenBuddy/openbuddy-deepseek-67b-v15-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-deepseek-67b-v15-base](https://huggingface.co/OpenBuddy/openbuddy-deepseek-67b-v15-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T00:18:57.450795](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base/blob/main/results_2023-12-10T00-18-57.450795.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7077078977933982,\n \"acc_stderr\": 0.030015760444243065,\n \"acc_norm\": 0.7114942838020437,\n \"acc_norm_stderr\": 0.030600759357365358,\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.016874805001453178,\n \"mc2\": 0.5230963516759597,\n \"mc2_stderr\": 0.014845955802002899\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882419,\n \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902276\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6751643098984266,\n \"acc_stderr\": 0.004673563250946104,\n \"acc_norm\": 0.8602867954590719,\n \"acc_norm_stderr\": 0.003459806991389836\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775406,\n \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775406\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n \"acc_stderr\": 0.029514245964291762,\n \"acc_norm\": 0.8541666666666666,\n \"acc_norm_stderr\": 0.029514245964291762\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.02989614568209546,\n \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.02989614568209546\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.03855289616378949,\n \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.03855289616378949\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.025699352832131792,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.025699352832131792\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n \"acc_stderr\": 0.02141724293632159,\n \"acc_norm\": 0.8290322580645161,\n \"acc_norm_stderr\": 0.02141724293632159\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n \"acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424208,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424208\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 
0.01599622932024412\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.02306043838085774,\n \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.02306043838085774\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.029869605095316908,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.029869605095316908\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.819327731092437,\n \"acc_stderr\": 0.02499196496660076,\n \"acc_norm\": 0.819327731092437,\n \"acc_norm_stderr\": 0.02499196496660076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634612,\n \"acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634612\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9009174311926605,\n \"acc_stderr\": 0.012809780081878927,\n \"acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.012809780081878927\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.033509916046960436,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.033509916046960436\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316942,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316942\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744631,\n \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744631\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547129,\n \"acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547129\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580662,\n \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580662\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.01789378490401853,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.01789378490401853\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8991060025542784,\n \"acc_stderr\": 0.010770472014886722,\n \"acc_norm\": 0.8991060025542784,\n \"acc_norm_stderr\": 0.010770472014886722\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.02228963885261789,\n \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.02228963885261789\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n \"acc_stderr\": 0.016476342210253996,\n \"acc_norm\": 0.4145251396648045,\n \"acc_norm_stderr\": 0.016476342210253996\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7908496732026143,\n \"acc_stderr\": 0.023287685312334806,\n \"acc_norm\": 0.7908496732026143,\n \"acc_norm_stderr\": 0.023287685312334806\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8549382716049383,\n \"acc_stderr\": 0.019594877019727956,\n \"acc_norm\": 0.8549382716049383,\n \"acc_norm_stderr\": 0.019594877019727956\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5632333767926988,\n \"acc_stderr\": 0.012667701919603657,\n \"acc_norm\": 0.5632333767926988,\n \"acc_norm_stderr\": 0.012667701919603657\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.016774672365468504,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.016774672365468504\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327692,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327692\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.016874805001453178,\n \"mc2\": 0.5230963516759597,\n \"mc2_stderr\": 0.014845955802002899\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222782\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5686125852918877,\n \"acc_stderr\": 
0.013642195352511575\n }\n}\n```", "repo_url": "https://huggingface.co/OpenBuddy/openbuddy-deepseek-67b-v15-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|arc:challenge|25_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|gsm8k|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hellaswag|10_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-18-57.450795.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-18-57.450795.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-18-57.450795.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T00-18-57.450795.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-18-57.450795.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T00_18_57.450795", "path": ["**/details_harness|winogrande|5_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T00-18-57.450795.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_10T00_18_57.450795", "path": ["results_2023-12-10T00-18-57.450795.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T00-18-57.450795.parquet"]}]}]} | 2023-12-10T00:22:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-67b-v15-base
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model OpenBuddy/openbuddy-deepseek-67b-v15-base on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
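A minimal sketch, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used throughout this dump:

```python
from datasets import load_dataset

# Load the Winogrande details split for this evaluation run
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v15-base",
	"harness_winogrande_5",
	split="train")
```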
## Latest results
These are the latest results from run 2023-12-10T00:18:57.450795 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-67b-v15-base",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddy/ope... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-67b-v15-base",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation r... | [
6,
28,
31,
177,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-67b-v15-base## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of mod... |
e398dc168a6e5932e5071ed1980cbc06314d4223 | # Dataset Card for "cai-conversation-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/cai-conversation-dev | [
"region:us"
] | 2023-12-10T00:50:16+00:00 | {"dataset_info": {"features": [{"name": "index", "dtype": "int64"}, {"name": "prompt", "dtype": "string"}, {"name": "init_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "init_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "critic_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "critic_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "revision_prompt", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "revision_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 9128, "num_examples": 4}, {"name": "train_prefs", "num_bytes": 10733, "num_examples": 4}, {"name": "test_sft", "num_bytes": 15069, "num_examples": 4}, {"name": "test_prefs", "num_bytes": 11987, "num_examples": 4}], "download_size": 126881, "dataset_size": 46917}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]} | 2024-01-09T19:30:16+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cai-conversation-dev"
More Information needed | [
"# Dataset Card for \"cai-conversation-dev\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cai-conversation-dev\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"cai-conversation-dev\"\n\nMore Information needed"
] |
d7606a863f20fc226b727e73f8f1a70ae29e598a |
This dataset stores on Hugging Face some characters that are not suitable for direct display. The text field is lightly encoded for obfuscation.
Usage
Load the model
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the Qwen-1.8B-based ChatHaruhi checkpoint; trust_remote_code is required for Qwen's custom model code
tokenizer = AutoTokenizer.from_pretrained("silk-road/Chat-Haruhi_qwen_1_8", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("silk-road/Chat-Haruhi_qwen_1_8", trust_remote_code=True).half().cuda()
model = model.eval()
```
For details, see this notebook: https://github.com/LC1332/Chat-Haruhi-Suzumiya/blob/main/notebook/ChatHaruhi_x_Qwen1_8B.ipynb
```python
from ChatHaruhi import ChatHaruhi

# Build a chatbot from a role stored in this dataset (role path kept verbatim; 女贤者 = "female sage")
chatbot = ChatHaruhi( role_from_hf = 'silk-road/ChatHaruhi-Waifu/女贤者', max_len_story = 1000 )
# role='男子' ("man") is the speaker; text is the user's line ("You can't move anymore")
prompt = chatbot.generate_prompt(role='男子', text = '你已经不能动了')
response, _ = model.chat(tokenizer, prompt, history=[])
print(response)
chatbot.append_response(response)
# Model output:
# 女贤者:「啊啊啊,不可以!」
```
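To continue the conversation, the same pattern repeats each turn. A sketch reusing only the calls shown above (the follow-up user line is a placeholder, and it is assumed that ChatHaruhi folds earlier turns into the generated prompt):

```python
# Next turn: build a new prompt from the user's next line, then query the model again.
next_text = '<next user line>'  # placeholder, not from the original card
prompt = chatbot.generate_prompt(role='男子', text=next_text)
response, _ = model.chat(tokenizer, prompt, history=[])
print(response)
chatbot.append_response(response)
```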
Project link: https://github.com/LC1332/Chat-Haruhi-Suzumiya
New corpus contributions are welcome. | silk-road/ChatHaruhi-Waifu | [
"task_categories:text-generation",
"size_categories:n<1K",
"language:zh",
"license:cc-by-4.0",
"region:us"
] | 2023-12-10T01:00:10+00:00 | {"language": ["zh"], "license": "cc-by-4.0", "size_categories": ["n<1K"], "task_categories": ["text-generation"]} | 2023-12-10T01:25:05+00:00 | [] | [
"zh"
] | TAGS
#task_categories-text-generation #size_categories-n<1K #language-Chinese #license-cc-by-4.0 #region-us
|
This dataset stores on Hugging Face some characters that are not suitable for direct display. The text field is lightly encoded for obfuscation.
Usage
Load the model
For details, see this notebook: https://URL
Project link: https://URL
New corpus contributions are welcome. | [] | [
"TAGS\n#task_categories-text-generation #size_categories-n<1K #language-Chinese #license-cc-by-4.0 #region-us \n"
] | [
41
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-n<1K #language-Chinese #license-cc-by-4.0 #region-us \n"
] |
ce65b53c0fd657004586e9a23ae1128069c01d44 | # Dataset Card for "summarize_from_feedback_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | chargoddard/summarize_from_feedback_alpaca | [
"region:us"
] | 2023-12-10T01:32:22+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 138986664, "num_examples": 92858}], "download_size": 16466576, "dataset_size": 138986664}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-10T01:32:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "summarize_from_feedback_alpaca"
More Information needed | [
"# Dataset Card for \"summarize_from_feedback_alpaca\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"summarize_from_feedback_alpaca\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"summarize_from_feedback_alpaca\"\n\nMore Information needed"
] |
5e65876b465126ff5fe0ffd5cacbb8d24d9fc81e |
# Dataset Card for Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp](https://huggingface.co/PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-3-Slerp",
"harness_winogrande_5",
split="train")
```
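Each configuration also exposes a timestamped split and a `latest` split, and the aggregated metrics live in the `results` configuration. A hedged sketch of loading those (config and split names are taken from the dataset metadata below):

```python
# Load the aggregated results of the latest run
results = load_dataset("open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-3-Slerp",
	"results",
	split="latest")
```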
## Latest results
These are the [latest results from run 2023-12-10T01:51:52.298552](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-3-Slerp/blob/main/results_2023-12-10T01-51-52.298552.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6460435872902499,
"acc_stderr": 0.03203449074198557,
"acc_norm": 0.6469349129421068,
"acc_norm_stderr": 0.032681317097745945,
"mc1": 0.4602203182374541,
"mc1_stderr": 0.017448017223960884,
"mc2": 0.627788323256757,
"mc2_stderr": 0.014997858897015229
},
"harness|arc:challenge|25": {
"acc": 0.6450511945392492,
"acc_stderr": 0.013983036904094092,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173311
},
"harness|hellaswag|10": {
"acc": 0.667894841665007,
"acc_stderr": 0.00470005967137464,
"acc_norm": 0.861979685321649,
"acc_norm_stderr": 0.0034421638433628794
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406796,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406796
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723285,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723285
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601432,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601432
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.02410571260775431,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.02410571260775431
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.016125543823552954,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.016125543823552954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032197,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032197
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.01899970738316268,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.01899970738316268
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274645,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4602203182374541,
"mc1_stderr": 0.017448017223960884,
"mc2": 0.627788323256757,
"mc2_stderr": 0.014997858897015229
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987726
},
"harness|gsm8k|5": {
"acc": 0.6777862016679302,
"acc_stderr": 0.012872435481188778
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-3-Slerp | [
"region:us"
] | 2023-12-10T01:54:43+00:00 | {"pretty_name": "Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp](https://huggingface.co/PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-3-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T01:51:52.298552](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-3-Slerp/blob/main/results_2023-12-10T01-51-52.298552.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6460435872902499,\n \"acc_stderr\": 0.03203449074198557,\n \"acc_norm\": 0.6469349129421068,\n \"acc_norm_stderr\": 0.032681317097745945,\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.017448017223960884,\n \"mc2\": 0.627788323256757,\n \"mc2_stderr\": 0.014997858897015229\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6450511945392492,\n \"acc_stderr\": 0.013983036904094092,\n \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173311\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.667894841665007,\n \"acc_stderr\": 0.00470005967137464,\n \"acc_norm\": 0.861979685321649,\n \"acc_norm_stderr\": 0.0034421638433628794\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406796,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406796\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723285,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723285\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n 
\"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601432,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601432\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n \"acc_stderr\": 0.016125543823552954,\n \"acc_norm\": 0.3675977653631285,\n \"acc_norm_stderr\": 0.016125543823552954\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n \"acc_stderr\": 0.012719949543032197,\n \"acc_norm\": 0.4556714471968709,\n \"acc_norm_stderr\": 0.012719949543032197\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316268,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316268\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274645,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274645\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.017448017223960884,\n \"mc2\": 0.627788323256757,\n \"mc2_stderr\": 0.014997858897015229\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987726\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6777862016679302,\n 
\"acc_stderr\": 0.012872435481188778\n }\n}\n```", "repo_url": "https://huggingface.co/PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|arc:challenge|25_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|gsm8k|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hellaswag|10_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T01-51-52.298552.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T01-51-52.298552.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T01-51-52.298552.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T01-51-52.298552.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T01-51-52.298552.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T01_51_52.298552", "path": ["**/details_harness|winogrande|5_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T01-51-52.298552.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_10T01_51_52.298552", "path": ["results_2023-12-10T01-51-52.298552.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T01-51-52.298552.parquet"]}]}]} | 2023-12-10T01:55:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
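```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-3-Slerp",
    "harness_winogrande_5",
    split="train")
```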
## Latest results
These are the latest results from run 2023-12-10T01:51:52.298552 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
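The aggregate block below is excerpted from this run's results file (the full per-task breakdown is stored in the dataset metadata):

```python
{
    "all": {
        "acc": 0.6460435872902499,
        "acc_stderr": 0.03203449074198557,
        "acc_norm": 0.6469349129421068,
        "acc_norm_stderr": 0.032681317097745945,
        "mc1": 0.4602203182374541,
        "mc1_stderr": 0.017448017223960884,
        "mc2": 0.627788323256757,
        "mc2_stderr": 0.014997858897015229
    }
}
```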
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model PulsarAI... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluat... | [
6,
30,
31,
179,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PulsarAI/OpenHermes-2.5-neural-chat-v3-3-Slerp## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run o... |
d52f7989c0968eedcf0d64ac4aae3ba4ea1a6550 | # Dataset Card for "mbxp_wasm_no_funcname"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | JeremiahZ/mbxp_wasm_no_funcname | [
"region:us"
] | 2023-12-10T01:57:07+00:00 | {"dataset_info": {"features": [{"name": "task_id", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "test", "dtype": "string"}, {"name": "entry_point", "dtype": "string"}, {"name": "canonical_solution", "dtype": "string"}, {"name": "wat", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3916582, "num_examples": 773}], "download_size": 956941, "dataset_size": 3916582}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-10T01:57:12+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "mbxp_wasm_no_funcname"
More Information needed | [
"# Dataset Card for \"mbxp_wasm_no_funcname\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"mbxp_wasm_no_funcname\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"mbxp_wasm_no_funcname\"\n\nMore Information needed"
] |
5285730a29262dad6e1cbd4e5e167b7292a6e17c |
# Dataset Card for Evaluation run of PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp](https://huggingface.co/PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
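As a minimal sketch of that layout (the `results` configuration name and the `latest` split follow the conventions described above; the timestamped split name is inferred from this run's timestamp and should be verified against the repository):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run ("results" config, "latest" split).
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp",
    "results",
    split="latest",
)

# A specific run can be addressed by its timestamped split name
# (assumed here from the run timestamp below; verify against the repository).
run = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp",
    "results",
    split="2023_12_10T02_45_05.724710",
)
```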
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-10T02:45:05.724710](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp/blob/main/results_2023-12-10T02-45-05.724710.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6464664842416276,
"acc_stderr": 0.03217172590988582,
"acc_norm": 0.646376680571289,
"acc_norm_stderr": 0.032836550184029964,
"mc1": 0.39167686658506734,
"mc1_stderr": 0.01708779588176963,
"mc2": 0.5514034273421413,
"mc2_stderr": 0.015341235748555455
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303098,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756564
},
"harness|hellaswag|10": {
"acc": 0.6632144991037642,
"acc_stderr": 0.004716449792353795,
"acc_norm": 0.8539135630352519,
"acc_norm_stderr": 0.003524710243768616
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944447,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944447
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.02410571260775431,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.02410571260775431
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3877094972067039,
"acc_stderr": 0.01629533232815581,
"acc_norm": 0.3877094972067039,
"acc_norm_stderr": 0.01629533232815581
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135118,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135118
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422466,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422466
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807082,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807082
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39167686658506734,
"mc1_stderr": 0.01708779588176963,
"mc2": 0.5514034273421413,
"mc2_stderr": 0.015341235748555455
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626915
},
"harness|gsm8k|5": {
"acc": 0.7164518574677786,
"acc_stderr": 0.012415070917508124
}
}
```
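As a minimal sketch of how these aggregated numbers can be reloaded (assuming only that the `datasets` library is installed; the `results` configuration and its `latest` split are declared in this repo's config metadata):

```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent evaluation run.
# Earlier runs remain available under timestamped splits such as
# "2023_12_10T02_45_05.724710".
results = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics shown above
```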
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_PulsarAI__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp | [
"region:us"
] | 2023-12-10T02:47:58+00:00 | {"pretty_name": "Evaluation run of PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp](https://huggingface.co/PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T02:45:05.724710](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp/blob/main/results_2023-12-10T02-45-05.724710.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6464664842416276,\n \"acc_stderr\": 0.03217172590988582,\n \"acc_norm\": 0.646376680571289,\n \"acc_norm_stderr\": 0.032836550184029964,\n \"mc1\": 0.39167686658506734,\n \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.5514034273421413,\n \"mc2_stderr\": 0.015341235748555455\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303098,\n \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756564\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6632144991037642,\n \"acc_stderr\": 0.004716449792353795,\n \"acc_norm\": 0.8539135630352519,\n \"acc_norm_stderr\": 0.003524710243768616\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 
0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944447,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944447\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n 
\"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3877094972067039,\n \"acc_stderr\": 0.01629533232815581,\n \"acc_norm\": 0.3877094972067039,\n \"acc_norm_stderr\": 0.01629533232815581\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135118,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135118\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422466,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422466\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807082,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807082\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39167686658506734,\n \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.5514034273421413,\n \"mc2_stderr\": 0.015341235748555455\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626915\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7164518574677786,\n \"acc_stderr\": 0.012415070917508124\n }\n}\n```", 
"repo_url": "https://huggingface.co/PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|arc:challenge|25_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|gsm8k|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hellaswag|10_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T02-45-05.724710.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T02-45-05.724710.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T02-45-05.724710.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T02-45-05.724710.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T02-45-05.724710.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T02_45_05.724710", "path": ["**/details_harness|winogrande|5_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T02-45-05.724710.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_10T02_45_05.724710", "path": ["results_2023-12-10T02-45-05.724710.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T02-45-05.724710.parquet"]}]}]} | 2023-12-10T02:48:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
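(The code block itself is omitted in this plain-text rendering; a minimal sketch, mirroring the loading pattern these cards use and assuming the `datasets` library is installed:)

```python
from datasets import load_dataset

# Any of the 63 configurations declared in the repo metadata
# (e.g. "harness_gsm8k_5") can be substituted for "harness_winogrande_5".
data = load_dataset(
    "open-llm-leaderboard/details_PulsarAI__MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp",
    "harness_winogrande_5",
    split="train",
)
```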
## Latest results
These are the latest results from run 2023-12-10T02:45:05.724710 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during th... | [
6,
34,
31,
183,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PulsarAI/MetaMath-OpenHermes-2.5-neural-chat-v3-3-Slerp## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluat... |
5df6118d465bbc82a8d1d76d93050e501279db17 |
# Dataset Card for Evaluation run of sequelbox/SunsetBoulevard
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/sequelbox/SunsetBoulevard
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [sequelbox/SunsetBoulevard](https://huggingface.co/sequelbox/SunsetBoulevard) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sequelbox__SunsetBoulevard",
"harness_winogrande_5",
split="train")
```
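If you are unsure which of the 63 configurations to request, they can be enumerated first (a small sketch; `get_dataset_config_names` is part of the `datasets` library):

```python
from datasets import get_dataset_config_names

# Lists every evaluation configuration stored in this repo,
# e.g. "harness_arc_challenge_25", ..., plus the aggregated "results".
configs = get_dataset_config_names("open-llm-leaderboard/details_sequelbox__SunsetBoulevard")
print(len(configs), configs[:5])
```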
## Latest results
These are the [latest results from run 2023-12-10T03:02:57.544409](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__SunsetBoulevard/blob/main/results_2023-12-10T03-02-57.544409.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7110861444687467,
"acc_stderr": 0.030063430253086363,
"acc_norm": 0.7154441745417264,
"acc_norm_stderr": 0.030639690759115483,
"mc1": 0.5569155446756426,
"mc1_stderr": 0.01738973034687711,
"mc2": 0.7029226076594556,
"mc2_stderr": 0.013335950631417065
},
"harness|arc:challenge|25": {
"acc": 0.6552901023890785,
"acc_stderr": 0.01388881628678211,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274776
},
"harness|hellaswag|10": {
"acc": 0.7438757219677355,
"acc_stderr": 0.004355992090031012,
"acc_norm": 0.9095797649870544,
"acc_norm_stderr": 0.0028619676953189122
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322605,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.0271342916287417,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.0271342916287417
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.035506839891655796,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.035506839891655796
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6893617021276596,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.6893617021276596,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4656084656084656,
"acc_stderr": 0.025690321762493848,
"acc_norm": 0.4656084656084656,
"acc_norm_stderr": 0.025690321762493848
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.02157624818451459,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.02157624818451459
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983127,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983127
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7773109243697479,
"acc_stderr": 0.02702543349888238,
"acc_norm": 0.7773109243697479,
"acc_norm_stderr": 0.02702543349888238
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9100917431192661,
"acc_stderr": 0.012264304540230444,
"acc_norm": 0.9100917431192661,
"acc_norm_stderr": 0.012264304540230444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6342592592592593,
"acc_stderr": 0.032847388576472056,
"acc_norm": 0.6342592592592593,
"acc_norm_stderr": 0.032847388576472056
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.02034340073486884,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.02034340073486884
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8295964125560538,
"acc_stderr": 0.025234593447136175,
"acc_norm": 0.8295964125560538,
"acc_norm_stderr": 0.025234593447136175
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8404907975460123,
"acc_stderr": 0.028767481725983854,
"acc_norm": 0.8404907975460123,
"acc_norm_stderr": 0.028767481725983854
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.017004368568132346,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.017004368568132346
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8722860791826309,
"acc_stderr": 0.011935626313999876,
"acc_norm": 0.8722860791826309,
"acc_norm_stderr": 0.011935626313999876
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.02162807738019612,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.02162807738019612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6122905027932961,
"acc_stderr": 0.016295332328155807,
"acc_norm": 0.6122905027932961,
"acc_norm_stderr": 0.016295332328155807
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02073635840806,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02073635840806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.599290780141844,
"acc_stderr": 0.029233465745573096,
"acc_norm": 0.599290780141844,
"acc_norm_stderr": 0.029233465745573096
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5684485006518905,
"acc_stderr": 0.012650007999463909,
"acc_norm": 0.5684485006518905,
"acc_norm_stderr": 0.012650007999463909
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887667,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887667
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.01707737337785693,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.01707737337785693
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.02500025603954619,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.02500025603954619
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646612,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646612
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5569155446756426,
"mc1_stderr": 0.01738973034687711,
"mc2": 0.7029226076594556,
"mc2_stderr": 0.013335950631417065
},
"harness|winogrande|5": {
"acc": 0.8421468034727704,
"acc_stderr": 0.010247165248719763
},
"harness|gsm8k|5": {
"acc": 0.5466262319939348,
"acc_stderr": 0.013712471049515446
}
}
```
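To consume these aggregated numbers programmatically rather than copying them from the JSON above, you can fetch the raw results file from the dataset repository. A minimal sketch, assuming the file is laid out as printed above; the `get("results", ...)` fallback is a defensive guess in case the metrics are nested under a top-level `"results"` key:
```python
import json

from huggingface_hub import hf_hub_download

# Download the results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_sequelbox__SunsetBoulevard",
    filename="results_2023-12-10T03-02-57.544409.json",
    repo_type="dataset",
)
with open(path) as f:
    blob = json.load(f)

# Metrics may sit at the top level (as printed above) or under a "results" key.
metrics = blob.get("results", blob)
print(metrics["all"]["acc_norm"])         # aggregated normalized accuracy
print(metrics["harness|gsm8k|5"]["acc"])  # a single-task metric (GSM8K accuracy)
```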
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_sequelbox__SunsetBoulevard | [
"region:us"
] | 2023-12-10T03:05:57+00:00 | {"pretty_name": "Evaluation run of sequelbox/SunsetBoulevard", "dataset_summary": "Dataset automatically created during the evaluation run of model [sequelbox/SunsetBoulevard](https://huggingface.co/sequelbox/SunsetBoulevard) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sequelbox__SunsetBoulevard\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T03:02:57.544409](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__SunsetBoulevard/blob/main/results_2023-12-10T03-02-57.544409.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7110861444687467,\n \"acc_stderr\": 0.030063430253086363,\n \"acc_norm\": 0.7154441745417264,\n \"acc_norm_stderr\": 0.030639690759115483,\n \"mc1\": 0.5569155446756426,\n \"mc1_stderr\": 0.01738973034687711,\n \"mc2\": 0.7029226076594556,\n \"mc2_stderr\": 0.013335950631417065\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6552901023890785,\n \"acc_stderr\": 0.01388881628678211,\n \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274776\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7438757219677355,\n \"acc_stderr\": 0.004355992090031012,\n \"acc_norm\": 0.9095797649870544,\n \"acc_norm_stderr\": 0.0028619676953189122\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322605,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.0271342916287417,\n \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.0271342916287417\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n 
\"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.035506839891655796,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.035506839891655796\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4656084656084656,\n \"acc_stderr\": 0.025690321762493848,\n \"acc_norm\": 0.4656084656084656,\n \"acc_norm_stderr\": 0.025690321762493848\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n \"acc_stderr\": 0.02157624818451459,\n \"acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.02157624818451459\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983127,\n \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983127\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7128205128205128,\n 
\"acc_stderr\": 0.022939925418530616,\n \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530616\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.02702543349888238,\n \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.02702543349888238\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9100917431192661,\n \"acc_stderr\": 0.012264304540230444,\n \"acc_norm\": 0.9100917431192661,\n \"acc_norm_stderr\": 0.012264304540230444\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6342592592592593,\n \"acc_stderr\": 0.032847388576472056,\n \"acc_norm\": 0.6342592592592593,\n \"acc_norm_stderr\": 0.032847388576472056\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.890295358649789,\n \"acc_stderr\": 0.02034340073486884,\n \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.02034340073486884\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8295964125560538,\n \"acc_stderr\": 0.025234593447136175,\n \"acc_norm\": 0.8295964125560538,\n \"acc_norm_stderr\": 0.025234593447136175\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8404907975460123,\n \"acc_stderr\": 0.028767481725983854,\n \"acc_norm\": 0.8404907975460123,\n \"acc_norm_stderr\": 0.028767481725983854\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.017004368568132346,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.017004368568132346\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n \"acc_stderr\": 0.011935626313999876,\n \"acc_norm\": 
0.8722860791826309,\n \"acc_norm_stderr\": 0.011935626313999876\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.02162807738019612,\n \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.02162807738019612\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6122905027932961,\n \"acc_stderr\": 0.016295332328155807,\n \"acc_norm\": 0.6122905027932961,\n \"acc_norm_stderr\": 0.016295332328155807\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912258,\n \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912258\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02073635840806,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02073635840806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.599290780141844,\n \"acc_stderr\": 0.029233465745573096,\n \"acc_norm\": 0.599290780141844,\n \"acc_norm_stderr\": 0.029233465745573096\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5684485006518905,\n \"acc_stderr\": 0.012650007999463909,\n \"acc_norm\": 0.5684485006518905,\n \"acc_norm_stderr\": 0.012650007999463909\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887667,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887667\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.01707737337785693,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.01707737337785693\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.02500025603954619,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.02500025603954619\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646612,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646612\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5569155446756426,\n \"mc1_stderr\": 0.01738973034687711,\n \"mc2\": 0.7029226076594556,\n \"mc2_stderr\": 0.013335950631417065\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5466262319939348,\n \"acc_stderr\": 0.013712471049515446\n }\n}\n```", "repo_url": "https://huggingface.co/sequelbox/SunsetBoulevard", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|arc:challenge|25_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|gsm8k|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hellaswag|10_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-02-57.544409.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-02-57.544409.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-02-57.544409.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T03-02-57.544409.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-02-57.544409.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-02-57.544409.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["**/details_harness|winogrande|5_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T03-02-57.544409.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T03_02_57.544409", "path": ["results_2023-12-10T03-02-57.544409.parquet"]}, {"split": "latest", "path": 
["results_2023-12-10T03-02-57.544409.parquet"]}]}]} | 2023-12-10T03:06:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sequelbox/SunsetBoulevard
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model sequelbox/SunsetBoulevard on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
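A minimal sketch of that loading call, following the pattern used by the full cards in this dump (the repository id is assumed from the leaderboard's `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard naming convention.
data = load_dataset("open-llm-leaderboard/details_sequelbox__SunsetBoulevard",
                    "harness_winogrande_5",
                    split="train")
```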
## Latest results
These are the latest results from run 2023-12-10T03:02:57.544409 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of sequelbox/SunsetBoulevard",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model sequelbox/SunsetBoulevard on ... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sequelbox/SunsetBoulevard",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model sequ... | [
6,
18,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of sequelbox/SunsetBoulevard## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model sequelbox/Sun... |
4d2a9c731160fcb79fc97c3f826fa20962f40208 |
# Dataset Card for Evaluation run of chargoddard/piano-medley-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/piano-medley-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/piano-medley-7b](https://huggingface.co/chargoddard/piano-medley-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__piano-medley-7b",
"harness_winogrande_5",
split="train")
```
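To illustrate the "results" configuration mentioned above, here is a hedged sketch (not from the original card) that loads the aggregated metrics; the config name "results" and the "latest" split are those described in this summary:

```python
from datasets import load_dataset

# Load the aggregated metrics. "results" and "latest" are the config and
# split names described above; this is an illustrative sketch only.
results = load_dataset("open-llm-leaderboard/details_chargoddard__piano-medley-7b",
                       "results",
                       split="latest")
print(results[0])  # inspect the first row
```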
## Latest results
These are the [latest results from run 2023-12-10T03:24:54.482171](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__piano-medley-7b/blob/main/results_2023-12-10T03-24-54.482171.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6462767300930756,
"acc_stderr": 0.032134853847514466,
"acc_norm": 0.6489933678568897,
"acc_norm_stderr": 0.03277330582106223,
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6142054505900651,
"mc2_stderr": 0.015456544162012987
},
"harness|arc:challenge|25": {
"acc": 0.6399317406143344,
"acc_stderr": 0.014027516814585186,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518826
},
"harness|hellaswag|10": {
"acc": 0.6645090619398526,
"acc_stderr": 0.004711968379069029,
"acc_norm": 0.8536148177653854,
"acc_norm_stderr": 0.0035276951498235004
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.04115324610336953,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.04115324610336953
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.023661296393964273,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.023661296393964273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126253,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126253
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662253,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577612,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.01649540063582008,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.01649540063582008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.01271540484127774,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.01271540484127774
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6142054505900651,
"mc2_stderr": 0.015456544162012987
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987729
},
"harness|gsm8k|5": {
"acc": 0.5655799848369977,
"acc_stderr": 0.013653507211411417
}
}
```
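As a small, hedged illustration of how the per-task numbers above can be aggregated (this is not part of the original card; only three subtasks are inlined here, whereas the full dict has one entry per hendrycksTest subject):

```python
# Values copied from the results dict shown above; the key prefix
# "harness|hendrycksTest-" selects the MMLU-style subtasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
}

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"Mean accuracy over {len(mmlu_accs)} subtasks: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```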
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_chargoddard__piano-medley-7b | [
"region:us"
] | 2023-12-10T03:27:47+00:00 | {"pretty_name": "Evaluation run of chargoddard/piano-medley-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [chargoddard/piano-medley-7b](https://huggingface.co/chargoddard/piano-medley-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__piano-medley-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T03:24:54.482171](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__piano-medley-7b/blob/main/results_2023-12-10T03-24-54.482171.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6462767300930756,\n \"acc_stderr\": 0.032134853847514466,\n \"acc_norm\": 0.6489933678568897,\n \"acc_norm_stderr\": 0.03277330582106223,\n \"mc1\": 0.44063647490820074,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6142054505900651,\n \"mc2_stderr\": 0.015456544162012987\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6399317406143344,\n \"acc_stderr\": 0.014027516814585186,\n \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518826\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6645090619398526,\n \"acc_stderr\": 0.004711968379069029,\n \"acc_norm\": 0.8536148177653854,\n \"acc_norm_stderr\": 0.0035276951498235004\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.04115324610336953,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.04115324610336953\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6794871794871795,\n \"acc_stderr\": 0.023661296393964273,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.023661296393964273\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126253,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126253\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.013547415658662253,\n 
\"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.013547415658662253\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577612,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577612\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n \"acc_stderr\": 0.01649540063582008,\n \"acc_norm\": 0.41787709497206704,\n \"acc_norm_stderr\": 0.01649540063582008\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n \"acc_stderr\": 0.01271540484127774,\n \"acc_norm\": 0.45371577574967403,\n \"acc_norm_stderr\": 0.01271540484127774\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44063647490820074,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6142054505900651,\n \"mc2_stderr\": 0.015456544162012987\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5655799848369977,\n \"acc_stderr\": 0.013653507211411417\n }\n}\n```", "repo_url": 
"https://huggingface.co/chargoddard/piano-medley-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|arc:challenge|25_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|gsm8k|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hellaswag|10_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-24-54.482171.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-24-54.482171.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-24-54.482171.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T03-24-54.482171.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-24-54.482171.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T03-24-54.482171.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["**/details_harness|winogrande|5_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T03-24-54.482171.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T03_24_54.482171", "path": ["results_2023-12-10T03-24-54.482171.parquet"]}, {"split": "latest", "path": 
["results_2023-12-10T03-24-54.482171.parquet"]}]}]} | 2023-12-10T03:28:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of chargoddard/piano-medley-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model chargoddard/piano-medley-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
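For instance (a minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming; the config name is just one of the 63 available):

```python
from datasets import load_dataset

# Repo id inferred from the naming convention used by the leaderboard details repos.
data = load_dataset("open-llm-leaderboard/details_chargoddard__piano-medley-7b",
    "harness_winogrande_5",
    split="train")
```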
## Latest results
These are the latest results from run 2023-12-10T03:24:54.482171 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of chargoddard/piano-medley-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargoddard/piano-medley-7b... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chargoddard/piano-medley-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ch... | [
6,
21,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chargoddard/piano-medley-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargoddard... |
fe9a7c5cfba77d1c0f7185474254642a80b15ed8 |
This is a direct Chinese translation of the Verified-Camel dataset, produced with GPT-4. I hope you find it useful.
https://huggingface.co/datasets/LDJnr/Verified-Camel
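A minimal loading sketch for this translation (the split name is assumed to be the default "train"):

```python
from datasets import load_dataset

# Load the Chinese translation; inspect the first record to see its fields.
ds = load_dataset("noobmaster29/Verified-Camel-zh", split="train")
print(ds[0])
```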
Citation:
```
@article{daniele2023amplify-instruct,
  title={Amplify-Instruct: Synthetically Generated Diverse Multi-turn Conversations for Efficient LLM Training.},
author={Daniele, Luigi and Suphavadeeprasit},
  journal={arXiv preprint arXiv:(coming soon)},
year={2023}
}
``` | noobmaster29/Verified-Camel-zh | [
"task_categories:conversational",
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:n<1K",
"language:en",
"language:zh",
"license:apache-2.0",
"Physics",
"Chemistry",
"Math",
"Biology",
"Culture",
"Logic",
"region:us"
] | 2023-12-10T03:40:28+00:00 | {"language": ["en", "zh"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["conversational", "question-answering", "text-generation"], "tags": ["Physics", "Chemistry", "Math", "Biology", "Culture", "Logic"]} | 2023-12-10T03:57:00+00:00 | [] | [
"en",
"zh"
] | TAGS
#task_categories-conversational #task_categories-question-answering #task_categories-text-generation #size_categories-n<1K #language-English #language-Chinese #license-apache-2.0 #Physics #Chemistry #Math #Biology #Culture #Logic #region-us
|
This is a direct Chinese translation of the Verified-Camel dataset, produced with GPT-4. I hope you find it useful.
URL
Citation:
| [] | [
"TAGS\n#task_categories-conversational #task_categories-question-answering #task_categories-text-generation #size_categories-n<1K #language-English #language-Chinese #license-apache-2.0 #Physics #Chemistry #Math #Biology #Culture #Logic #region-us \n"
] | [
86
] | [
"passage: TAGS\n#task_categories-conversational #task_categories-question-answering #task_categories-text-generation #size_categories-n<1K #language-English #language-Chinese #license-apache-2.0 #Physics #Chemistry #Math #Biology #Culture #Logic #region-us \n"
] |
4caac69447532b1d6d6fb0e26154c274cd1b32b0 |
The following dataset was vectorized with the [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) model, and an index file was created with faiss.
[oshizo/japanese-wikipedia-paragraphs](https://huggingface.co/datasets/oshizo/japanese-wikipedia-paragraphs)
## Usage
First, download index_me5-base_IVF2048_PQ192.faiss from this repository.
```python
import faiss
import datasets
from sentence_transformers import SentenceTransformer

# Load the paragraph corpus and the prebuilt IVF+PQ faiss index.
ds = datasets.load_dataset("oshizo/japanese-wikipedia-paragraphs", split="train")
index = faiss.read_index("./index_me5-base_IVF2048_PQ192.faiss")

model = SentenceTransformer("intfloat/multilingual-e5-base")

# e5 models expect the "query: " prefix on search queries.
question = "日本で二番目に高い山は?"  # "What is the second highest mountain in Japan?"
emb = model.encode(["query: " + question])

# Retrieve the 10 nearest paragraphs for the single query.
scores, indexes = index.search(emb, 10)
scores = scores[0]
indexes = indexes[0]

results = []
for idx, score in zip(indexes, scores):
    idx = int(idx)
    passage = ds[idx]
    passage["score"] = score
    results.append(passage)
```
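Since this is an IVF index (IVF2048), recall can be traded against latency by probing more inverted lists at query time. The `nprobe` attribute is standard faiss; the value 32 below is only an illustrative choice, not one recommended by the index author:

```python
# Probe more clusters for higher recall at the cost of speed (default nprobe is 1).
index.nprobe = 32
scores, indexes = index.search(emb, 10)
```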
| oshizo/japanese-wikipedia-paragraphs-embeddings | [
"language:ja",
"license:cc-by-sa-4.0",
"region:us"
] | 2023-12-10T03:41:14+00:00 | {"language": ["ja"], "license": "cc-by-sa-4.0"} | 2023-12-15T13:16:42+00:00 | [] | [
"ja"
] | TAGS
#language-Japanese #license-cc-by-sa-4.0 #region-us
|
The following dataset was vectorized with the intfloat/multilingual-e5-base model, and an index file was created with faiss.
oshizo/japanese-wikipedia-paragraphs
## Usage
First, download index_me5-base_IVF2048_PQ192.faiss from this repository.
'''python
import faiss
import datasets
from sentence_transformers import SentenceTransformer
ds = datasets.load_dataset("oshizo/japanese-wikipedia-paragraphs", split="train")
index = faiss.read_index("./index_me5-base_IVF2048_PQ192.faiss")
model = SentenceTransformer("intfloat/multilingual-e5-base")
question = "日本で二番目に高い山は?"
emb = URL(["query: " + question])
scores, indexes = URL(emb, 10)
scores = scores[0]
indexes = indexes[0]
results = []
for idx, score in zip(indexes, scores):
idx = int(idx)
passage = ds[idx]
passage["score"] = score
URL((passage))
| [
"## Usage\n\nFirst, download index_me5-base_IVF2048_PQ192.faiss from this repository.\n\n'''python\nimport faiss\nimport datasets\nfrom sentence_transformers import SentenceTransformer\n\nds = datasets.load_dataset(\"oshizo/japanese-wikipedia-paragraphs\", split=\"train\")\n\nindex = faiss.read_index(\"./index_me5-... | [
"TAGS\n#language-Japanese #license-cc-by-sa-4.0 #region-us \n",
"## Usage\n\nFirst, download index_me5-base_IVF2048_PQ192.faiss from this repository.\n\n'''python\nimport faiss\nimport datasets\nfrom sentence_transformers import SentenceTransformer\n\nds = datasets.load_dataset(\"oshizo/japanese-wikipedia-paragra... | [
23,
238
] | [
"passage: TAGS\n#language-Japanese #license-cc-by-sa-4.0 #region-us \n## Usage\n\nFirst, download index_me5-base_IVF2048_PQ192.faiss from this repository.\n\n'''python\nimport faiss\nimport datasets\nfrom sentence_transformers import SentenceTransformer\n\nds = datasets.load_dataset(\"oshizo/japanese-wikipedia-para... |
00ffcde7262f2ca67ed149328b86712014e7ee4b | # Synthetic Malaysian QA
Generated common QA using ChatGPT-3.5 for:
1. Agrobank
2. Bank Negara Malaysia
3. Bank Perusahaan Kecil dan Sederhana Malaysia
4. Bank Rakyat
5. Bank Simpanan Nasional
6. Bursa Malaysia
7. Dewan Bahasa dan Pustaka
8. Institut Kesihatan Umum
9. Institut Penyelidikan Perubatan
10. Institut Penyelidikan Sains dan Teknologi Pertahanan
11. Institut Penyelidikan Tingkahlaku Kesihatan
12. Institut Penyelidikan dan Kemajuan Pertanian Malaysia
13. Jabatan Akauntan Negara
14. Jabatan Bomba dan Penyelamat Malaysia
15. Jabatan Hal Ehwal Kesatuan Sekerja
16. Jabatan Hal Ehwal Veteran
17. Jabatan Imigresen Malaysia
18. Jabatan Kastam Diraja Malaysia
19. Jabatan Kebajikan Masyarakat
20. Jabatan Kemajuan Orang Asli
21. Jabatan Kerajaan Tempatan
22. Jabatan Kerja Raya
23. Jabatan Keselamatan Jalan Raya
24. Jabatan Keselamatan dan Keselamatan Pekerjaan
25. Jabatan Ketua Hakim Peguam
26. Jabatan Landskap Negara
27. Jabatan Latihan Khidmat Negara
28. Jabatan Laut Malaysia
29. Jabatan Pembangunan Wanita
30. Jabatan Pendaftaran Pertubuhan Malaysia
31. Jabatan Penerangan Malaysia
32. Jabatan Pengangkutan Jalan
33. Jabatan Pengurusan Sisa Pepejal Negara
34. Jabatan Penilaian dan Perkhidmatan Negara
35. Jabatan Penjara Malaysia
36. Jabatan Perancangan Bandar dan Desa
37. Jabatan Perdana Menteri Malaysia
38. Jabatan Perhubungan Perusahaan
39. Jabatan Perikanan Malaysia
40. Jabatan Perkhidmatan Kuarantin dan Pemeriksaan Malaysia
41. Jabatan Perkhidmatan Veterinar
42. Jabatan Pertanian Malaysia
43. Jabatan Perumahan Negara
44. Jabatan Perumahan dan Pengurusan Strata
45. Jabatan Sukarelawan Malaysia
46. Jabatan Tenaga Kerja
47. Jabatan Tenaga Kerja Manusia
48. Khazanah Nasional
49. Kolej Pertanian
50. Kumpulan Wang Persaraan
51. Kumpulan Wang Simpanan Pekerja
52. Lembaga Hasil Dalam Negeri Malaysia
53. Lembaga Kemajuan Ikan Malaysia
54. Lembaga Kemajuan Pertanian Kemubu
55. Lembaga Kemajuan Pertanian Muda
56. Lembaga Pelabuhan Bintulu
57. Lembaga Pelabuhan Johor
58. Lembaga Pelabuhan Klang
59. Lembaga Pelabuhan Kuantan
60. Lembaga Pemasaran Pertanian Persekutuan
61. Lembaga Pembangunan Pelaburan Malaysia
62. Lembaga Pembiayaan Perumahan Sektor Awam
63. Lembaga Penapisan Filem
64. Lembaga Penduduk dan Pembangunan Keluarga Negara
65. Lembaga Peperiksaan Malaysia
66. Lembaga Perindustrian Nanas Malaysia
67. Lembaga Perkhidmatan Kewangan Labuan
68. Lembaga Pertubuhan Peladang
69. Lembaga Promosi Kesihatan Malaysia
70. Lembaga Totalisator Malaysia
71. Pusat Pergigian Kanak-Kanak & Kolej Latihan Pergigian Malaysia
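A minimal sketch for loading the generated QA pairs (the field layout is not documented here, so the first row is printed for inspection; the "train" split is an assumption):

```python
from datasets import load_dataset

qa = load_dataset("mesolitica/chatgpt-malaysian-general-qa", split="train")
print(qa[0])  # inspect the generated question/answer fields
```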
Notebooks at https://github.com/mesolitica/malaysian-dataset/tree/master/question-answer/chatgpt3.5-synthetic-malaysian-qa | mesolitica/chatgpt-malaysian-general-qa | [
"region:us"
] | 2023-12-10T04:12:32+00:00 | {} | 2023-12-10T18:50:21+00:00 | [] | [] | TAGS
#region-us
| # Synthetic Malaysian QA
Generated common QA using ChatGPT-3.5 for:
1. Agrobank
2. Bank Negara Malaysia
3. Bank Perusahaan Kecil dan Sederhana Malaysia
4. Bank Rakyat
5. Bank Simpanan Nasional
6. Bursa Malaysia
7. Dewan Bahasa dan Pustaka
8. Institut Kesihatan Umum
9. Institut Penyelidikan Perubatan
10. Institut Penyelidikan Sains dan Teknologi Pertahanan
11. Institut Penyelidikan Tingkahlaku Kesihatan
12. Institut Penyelidikan dan Kemajuan Pertanian Malaysia
13. Jabatan Akauntan Negara
14. Jabatan Bomba dan Penyelamat Malaysia
15. Jabatan Hal Ehwal Kesatuan Sekerja
16. Jabatan Hal Ehwal Veteran
17. Jabatan Imigresen Malaysia
18. Jabatan Kastam Diraja Malaysia
19. Jabatan Kebajikan Masyarakat
20. Jabatan Kemajuan Orang Asli
21. Jabatan Kerajaan Tempatan
22. Jabatan Kerja Raya
23. Jabatan Keselamatan Jalan Raya
24. Jabatan Keselamatan dan Keselamatan Pekerjaan
25. Jabatan Ketua Hakim Peguam
26. Jabatan Landskap Negara
27. Jabatan Latihan Khidmat Negara
28. Jabatan Laut Malaysia
29. Jabatan Pembangunan Wanita
30. Jabatan Pendaftaran Pertubuhan Malaysia
31. Jabatan Penerangan Malaysia
32. Jabatan Pengangkutan Jalan
33. Jabatan Pengurusan Sisa Pepejal Negara
34. Jabatan Penilaian dan Perkhidmatan Negara
35. Jabatan Penjara Malaysia
36. Jabatan Perancangan Bandar dan Desa
37. Jabatan Perdana Menteri Malaysia
38. Jabatan Perhubungan Perusahaan
39. Jabatan Perikanan Malaysia
40. Jabatan Perkhidmatan Kuarantin dan Pemeriksaan Malaysia
41. Jabatan Perkhidmatan Veterinar
42. Jabatan Pertanian Malaysia
43. Jabatan Perumahan Negara
44. Jabatan Perumahan dan Pengurusan Strata
45. Jabatan Sukarelawan Malaysia
46. Jabatan Tenaga Kerja
47. Jabatan Tenaga Kerja Manusia
48. Khazanah Nasional
49. Kolej Pertanian
50. Kumpulan Wang Persaraan
51. Kumpulan Wang Simpanan Pekerja
52. Lembaga Hasil Dalam Negeri Malaysia
53. Lembaga Kemajuan Ikan Malaysia
54. Lembaga Kemajuan Pertanian Kemubu
55. Lembaga Kemajuan Pertanian Muda
56. Lembaga Pelabuhan Bintulu
57. Lembaga Pelabuhan Johor
58. Lembaga Pelabuhan Klang
59. Lembaga Pelabuhan Kuantan
60. Lembaga Pemasaran Pertanian Persekutuan
61. Lembaga Pembangunan Pelaburan Malaysia
62. Lembaga Pembiayaan Perumahan Sektor Awam
63. Lembaga Penapisan Filem
64. Lembaga Penduduk dan Pembangunan Keluarga Negara
65. Lembaga Peperiksaan Malaysia
66. Lembaga Perindustrian Nanas Malaysia
67. Lembaga Perkhidmatan Kewangan Labuan
68. Lembaga Pertubuhan Peladang
69. Lembaga Promosi Kesihatan Malaysia
70. Lembaga Totalisator Malaysia
71. Pusat Pergigian Kanak-Kanak & Kolej Latihan Pergigian Malaysia
Notebooks at URL | [
"# Synthetic Malaysian QA\n\nGenerated common QA using ChatGPT3 for,\n1. Agrobank\n2. Bank Negara Malaysia\n3. Bank Perusahaan Kecil dan Sederhana Malaysia\n4. Bank Rakyat\n5. Bank Simpanan Nasional\n6. Bursa Malaysia\n7. Dewan Bahasa dan Pustaka\n8. Institut Kesihatan Umum\n9. Institut Penyelidikan Perubatan\n10. ... | [
"TAGS\n#region-us \n",
"# Synthetic Malaysian QA\n\nGenerated common QA using ChatGPT3 for,\n1. Agrobank\n2. Bank Negara Malaysia\n3. Bank Perusahaan Kecil dan Sederhana Malaysia\n4. Bank Rakyat\n5. Bank Simpanan Nasional\n6. Bursa Malaysia\n7. Dewan Bahasa dan Pustaka\n8. Institut Kesihatan Umum\n9. Institut Pen... | [
6,
493
] | [
"passage: TAGS\n#region-us \n# Synthetic Malaysian QA\n\nGenerated common QA using ChatGPT3 for,\n1. Agrobank\n2. Bank Negara Malaysia\n3. Bank Perusahaan Kecil dan Sederhana Malaysia\n4. Bank Rakyat\n5. Bank Simpanan Nasional\n6. Bursa Malaysia\n7. Dewan Bahasa dan Pustaka\n8. Institut Kesihatan Umum\n9. Institut ... |
f47538209787f1ba555045f666da0145b6ed8381 | # Dataset Card for "tiny_stories_packed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | P1ayer-1/tiny_stories_packed | [
"region:us"
] | 2023-12-10T04:40:00+00:00 | {"dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}], "splits": [{"name": "train", "num_bytes": 2146599252.0, "num_examples": 1046101}], "download_size": 894178226, "dataset_size": 2146599252.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-10T04:44:42+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "tiny_stories_packed"
More Information needed | [
"# Dataset Card for \"tiny_stories_packed\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"tiny_stories_packed\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"tiny_stories_packed\"\n\nMore Information needed"
] |
f9c63c9d398aa41d065a272cbfd4e59cf9cc901f |
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-Llama-Q](https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
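For the aggregated numbers alone, the "results" configuration can be loaded directly; the split names below come from the config listing in this card's metadata:

```python
from datasets import load_dataset

# "latest" always points to the newest run's aggregated results.
results = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q",
    "results",
    split="latest")
```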
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-10T04:42:16.291896](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q/blob/main/results_2023-12-10T04-42-16.291896.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.78055182695593,
"acc_stderr": 0.02737501256983463,
"acc_norm": 0.7866450755467821,
"acc_norm_stderr": 0.027870412250259477,
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591673,
"mc2": 0.5363792883186823,
"mc2_stderr": 0.014951574037726555
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303096,
"acc_norm": 0.6569965870307167,
"acc_norm_stderr": 0.01387242322371816
},
"harness|hellaswag|10": {
"acc": 0.6542521410077674,
"acc_stderr": 0.004746394613384537,
"acc_norm": 0.8522206731726748,
"acc_norm_stderr": 0.00354155826377912
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7555555555555555,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.7555555555555555,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.025648341251693605,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.025648341251693605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8037735849056604,
"acc_stderr": 0.024442388131100824,
"acc_norm": 0.8037735849056604,
"acc_norm_stderr": 0.024442388131100824
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.023112508176051236,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.023112508176051236
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6228070175438597,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.6228070175438597,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8137931034482758,
"acc_stderr": 0.03243946159004616,
"acc_norm": 0.8137931034482758,
"acc_norm_stderr": 0.03243946159004616
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7619047619047619,
"acc_stderr": 0.021935878081184763,
"acc_norm": 0.7619047619047619,
"acc_norm_stderr": 0.021935878081184763
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5952380952380952,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.5952380952380952,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9290322580645162,
"acc_stderr": 0.01460718907324613,
"acc_norm": 0.9290322580645162,
"acc_norm_stderr": 0.01460718907324613
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.7044334975369458,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.7044334975369458,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993107,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993107
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909036,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8358974358974359,
"acc_stderr": 0.01877843431342371,
"acc_norm": 0.8358974358974359,
"acc_norm_stderr": 0.01877843431342371
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.030478009819615823,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.030478009819615823
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8823529411764706,
"acc_stderr": 0.02092847255778879,
"acc_norm": 0.8823529411764706,
"acc_norm_stderr": 0.02092847255778879
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5695364238410596,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.5695364238410596,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9376146788990826,
"acc_stderr": 0.01036940784904345,
"acc_norm": 0.9376146788990826,
"acc_norm_stderr": 0.01036940784904345
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6990740740740741,
"acc_stderr": 0.03128039084329881,
"acc_norm": 0.6990740740740741,
"acc_norm_stderr": 0.03128039084329881
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01990739979131695,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01990739979131695
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9240506329113924,
"acc_stderr": 0.01724463325106569,
"acc_norm": 0.9240506329113924,
"acc_norm_stderr": 0.01724463325106569
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9173553719008265,
"acc_stderr": 0.02513538235660422,
"acc_norm": 0.9173553719008265,
"acc_norm_stderr": 0.02513538235660422
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9259259259259259,
"acc_stderr": 0.025317997297209734,
"acc_norm": 0.9259259259259259,
"acc_norm_stderr": 0.025317997297209734
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8895705521472392,
"acc_stderr": 0.024624937788941318,
"acc_norm": 0.8895705521472392,
"acc_norm_stderr": 0.024624937788941318
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6964285714285714,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.6964285714285714,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881349,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881349
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813235,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813235
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292853,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292853
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8294797687861272,
"acc_stderr": 0.020247961569303728,
"acc_norm": 0.8294797687861272,
"acc_norm_stderr": 0.020247961569303728
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7642458100558659,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.7642458100558659,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.020823758837580916,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.020823758837580916
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8488745980707395,
"acc_stderr": 0.02034274974442864,
"acc_norm": 0.8488745980707395,
"acc_norm_stderr": 0.02034274974442864
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571853,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02812163604063989,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02812163604063989
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.622555410691004,
"acc_stderr": 0.012380680911165804,
"acc_norm": 0.622555410691004,
"acc_norm_stderr": 0.012380680911165804
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8272058823529411,
"acc_stderr": 0.022966067585581774,
"acc_norm": 0.8272058823529411,
"acc_norm_stderr": 0.022966067585581774
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8251633986928104,
"acc_stderr": 0.015366167064780648,
"acc_norm": 0.8251633986928104,
"acc_norm_stderr": 0.015366167064780648
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591673,
"mc2": 0.5363792883186823,
"mc2_stderr": 0.014951574037726555
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363698
},
"harness|gsm8k|5": {
"acc": 0.604245640636846,
"acc_stderr": 0.013469823701048806
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q | [
"region:us"
] | 2023-12-10T04:45:04+00:00 | {"pretty_name": "Evaluation run of kyujinpy/PlatYi-34B-Llama-Q", "dataset_summary": "Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-Llama-Q](https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T04:42:16.291896](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q/blob/main/results_2023-12-10T04-42-16.291896.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.78055182695593,\n \"acc_stderr\": 0.02737501256983463,\n \"acc_norm\": 0.7866450755467821,\n \"acc_norm_stderr\": 0.027870412250259477,\n \"mc1\": 0.38555691554467564,\n \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5363792883186823,\n \"mc2_stderr\": 0.014951574037726555\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303096,\n \"acc_norm\": 0.6569965870307167,\n \"acc_norm_stderr\": 0.01387242322371816\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6542521410077674,\n \"acc_stderr\": 0.004746394613384537,\n \"acc_norm\": 0.8522206731726748,\n \"acc_norm_stderr\": 0.00354155826377912\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7555555555555555,\n \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.7555555555555555,\n \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.025648341251693605,\n \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.025648341251693605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100824,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100824\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349417,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.6228070175438597,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8137931034482758,\n \"acc_stderr\": 0.03243946159004616,\n \"acc_norm\": 0.8137931034482758,\n \"acc_norm_stderr\": 0.03243946159004616\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7619047619047619,\n \"acc_stderr\": 0.021935878081184763,\n \"acc_norm\": 0.7619047619047619,\n \"acc_norm_stderr\": 0.021935878081184763\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5952380952380952,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.5952380952380952,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9290322580645162,\n \"acc_stderr\": 0.01460718907324613,\n \"acc_norm\": 0.9290322580645162,\n \"acc_norm_stderr\": 0.01460718907324613\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.7044334975369458,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.7044334975369458,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993107,\n \"acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993107\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909036,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909036\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8358974358974359,\n \"acc_stderr\": 0.01877843431342371,\n \"acc_norm\": 0.8358974358974359,\n \"acc_norm_stderr\": 0.01877843431342371\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.030478009819615823,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.030478009819615823\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8823529411764706,\n \"acc_stderr\": 0.02092847255778879,\n \"acc_norm\": 0.8823529411764706,\n \"acc_norm_stderr\": 0.02092847255778879\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5695364238410596,\n \"acc_stderr\": 0.04042809961395634,\n \"acc_norm\": 0.5695364238410596,\n \"acc_norm_stderr\": 0.04042809961395634\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9376146788990826,\n \"acc_stderr\": 0.01036940784904345,\n \"acc_norm\": 0.9376146788990826,\n \"acc_norm_stderr\": 0.01036940784904345\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6990740740740741,\n \"acc_stderr\": 0.03128039084329881,\n \"acc_norm\": 0.6990740740740741,\n \"acc_norm_stderr\": 0.03128039084329881\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.01990739979131695,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01990739979131695\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9240506329113924,\n \"acc_stderr\": 0.01724463325106569,\n \"acc_norm\": 0.9240506329113924,\n \"acc_norm_stderr\": 0.01724463325106569\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9173553719008265,\n \"acc_stderr\": 0.02513538235660422,\n \"acc_norm\": 0.9173553719008265,\n \"acc_norm_stderr\": 0.02513538235660422\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9259259259259259,\n \"acc_stderr\": 0.025317997297209734,\n \"acc_norm\": 0.9259259259259259,\n \"acc_norm_stderr\": 0.025317997297209734\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8895705521472392,\n \"acc_stderr\": 0.024624937788941318,\n \"acc_norm\": 0.8895705521472392,\n \"acc_norm_stderr\": 0.024624937788941318\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6964285714285714,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.6964285714285714,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881349,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881349\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813235,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813235\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9067688378033205,\n \"acc_stderr\": 0.010397417087292853,\n \"acc_norm\": 0.9067688378033205,\n \"acc_norm_stderr\": 0.010397417087292853\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8294797687861272,\n \"acc_stderr\": 0.020247961569303728,\n \"acc_norm\": 0.8294797687861272,\n \"acc_norm_stderr\": 0.020247961569303728\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7642458100558659,\n \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.7642458100558659,\n \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.020823758837580916,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.020823758837580916\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8488745980707395,\n \"acc_stderr\": 0.02034274974442864,\n \"acc_norm\": 0.8488745980707395,\n \"acc_norm_stderr\": 0.02034274974442864\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571853,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571853\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02812163604063989,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02812163604063989\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.622555410691004,\n \"acc_stderr\": 0.012380680911165804,\n \"acc_norm\": 0.622555410691004,\n \"acc_norm_stderr\": 0.012380680911165804\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8272058823529411,\n \"acc_stderr\": 0.022966067585581774,\n \"acc_norm\": 0.8272058823529411,\n \"acc_norm_stderr\": 0.022966067585581774\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8251633986928104,\n \"acc_stderr\": 0.015366167064780648,\n \"acc_norm\": 0.8251633986928104,\n \"acc_norm_stderr\": 0.015366167064780648\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5363792883186823,\n \"mc2_stderr\": 0.014951574037726555\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363698\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.604245640636846,\n \"acc_stderr\": 
0.013469823701048806\n }\n}\n```", "repo_url": "https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|arc:challenge|25_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|gsm8k|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hellaswag|10_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T04-42-16.291896.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T04-42-16.291896.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T04-42-16.291896.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T04-42-16.291896.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T04-42-16.291896.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T04_42_16.291896", "path": ["**/details_harness|winogrande|5_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T04-42-16.291896.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_10T04_42_16.291896", "path": ["results_2023-12-10T04-42-16.291896.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T04-42-16.291896.parquet"]}]}]} | 2023-12-10T04:45:47+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-Llama-Q on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
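For example, a minimal sketch (the repository id below is inferred from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming pattern and is not quoted from this stripped card):

```python
from datasets import load_dataset

# Load one evaluation task's per-sample details; "train" tracks the latest run.
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q",
                    "harness_winogrande_5",
                    split="train")
```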
## Latest results
These are the latest results from run 2023-12-10T04:42:16.291896 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-Llama-Q... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ky... | [
6,
24,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/Pl... |
a7a1a2eefd6bef1a9f99cf12c85e701cdb1baa98 |
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200K-Q
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kyujinpy/PlatYi-34B-200K-Q
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-200K-Q](https://huggingface.co/kyujinpy/PlatYi-34B-200K-Q) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200K-Q",
"harness_winogrande_5",
split="train")
```
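
The same pattern also works for pinning a specific run or for reading the aggregated metrics. A minimal sketch, assuming the "results" configuration described above and the timestamped split name listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for the run (the "results" configuration described above).
results = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200K-Q",
                       "results",
                       split="latest")

# Per-sample details pinned to one run via its timestamped split name.
pinned = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200K-Q",
                      "harness_winogrande_5",
                      split="2023_12_10T05_34_24.325158")
```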
## Latest results
These are the [latest results from run 2023-12-10T05:34:24.325158](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200K-Q/blob/main/results_2023-12-10T05-34-24.325158.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7400651755080421,
"acc_stderr": 0.02871860714656746,
"acc_norm": 0.7513652661162374,
"acc_norm_stderr": 0.029282495673523156,
"mc1": 0.32068543451652387,
"mc1_stderr": 0.016339170373280906,
"mc2": 0.44207231913277784,
"mc2_stderr": 0.015063393630524507
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.014301752223279542,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.6336387173869747,
"acc_stderr": 0.00480825126968244,
"acc_norm": 0.8351921927902808,
"acc_norm_stderr": 0.0037024876621269487
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.030167533468632726,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.030167533468632726
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774631,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774631
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6693121693121693,
"acc_stderr": 0.02422996529842509,
"acc_norm": 0.6693121693121693,
"acc_norm_stderr": 0.02422996529842509
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.017308381281034495,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.017308381281034495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.03255086769970103,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.03255086769970103
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02548549837334323,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02548549837334323
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047926,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047926
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.015216761819262584,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.015216761819262584
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.764102564102564,
"acc_stderr": 0.02152596540740873,
"acc_norm": 0.764102564102564,
"acc_norm_stderr": 0.02152596540740873
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.030182099804387262,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.030182099804387262
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.819327731092437,
"acc_stderr": 0.02499196496660077,
"acc_norm": 0.819327731092437,
"acc_norm_stderr": 0.02499196496660077
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.926605504587156,
"acc_stderr": 0.011180976446357573,
"acc_norm": 0.926605504587156,
"acc_norm_stderr": 0.011180976446357573
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089674,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758528,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758528
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744631,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744631
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.03247224389917947,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.03247224389917947
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8466257668711656,
"acc_stderr": 0.0283116014414386,
"acc_norm": 0.8466257668711656,
"acc_norm_stderr": 0.0283116014414386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.032881802788086285,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.032881802788086285
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.01500631280644693,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.01500631280644693
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8876117496807152,
"acc_stderr": 0.011294541351216554,
"acc_norm": 0.8876117496807152,
"acc_norm_stderr": 0.011294541351216554
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.02115267696657527,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.02115267696657527
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.582122905027933,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.582122905027933,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213502,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213502
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.819935691318328,
"acc_stderr": 0.02182342285774494,
"acc_norm": 0.819935691318328,
"acc_norm_stderr": 0.02182342285774494
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.020581466138257135,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.020581466138257135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6063829787234043,
"acc_stderr": 0.029144544781596157,
"acc_norm": 0.6063829787234043,
"acc_norm_stderr": 0.029144544781596157
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6003911342894394,
"acc_stderr": 0.012510181636960672,
"acc_norm": 0.6003911342894394,
"acc_norm_stderr": 0.012510181636960672
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559352,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.016062056421968646,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.016062056421968646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900798,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32068543451652387,
"mc1_stderr": 0.016339170373280906,
"mc2": 0.44207231913277784,
"mc2_stderr": 0.015063393630524507
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989247
},
"harness|gsm8k|5": {
"acc": 0.24109173616376042,
"acc_stderr": 0.011782246325099723
}
}
```
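
To compare tasks at a glance, the dictionary above can be flattened into a table. A minimal sketch, assuming the results JSON has been saved locally under the file name used in this card and has the flat task-to-metrics layout shown above:

```python
import json

import pandas as pd

# Illustrative local path; the file holds the task -> metrics mapping above.
with open("results_2023-12-10T05-34-24.325158.json") as f:
    results = json.load(f)

# One row per task; tasks missing a metric (e.g. acc_norm for gsm8k) get NaN.
df = pd.DataFrame([{"task": task, **metrics} for task, metrics in results.items()])
print(df.set_index("task")[["acc", "acc_norm"]]
        .sort_values("acc_norm", ascending=False)
        .head())
```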
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200K-Q | [
"region:us"
] | 2023-12-10T05:37:13+00:00 | {"pretty_name": "Evaluation run of kyujinpy/PlatYi-34B-200K-Q", "dataset_summary": "Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-200K-Q](https://huggingface.co/kyujinpy/PlatYi-34B-200K-Q) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200K-Q\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T05:34:24.325158](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200K-Q/blob/main/results_2023-12-10T05-34-24.325158.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7400651755080421,\n \"acc_stderr\": 0.02871860714656746,\n \"acc_norm\": 0.7513652661162374,\n \"acc_norm_stderr\": 0.029282495673523156,\n \"mc1\": 0.32068543451652387,\n \"mc1_stderr\": 0.016339170373280906,\n \"mc2\": 0.44207231913277784,\n \"mc2_stderr\": 0.015063393630524507\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279542,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6336387173869747,\n \"acc_stderr\": 0.00480825126968244,\n \"acc_norm\": 0.8351921927902808,\n \"acc_norm_stderr\": 0.0037024876621269487\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.030167533468632726,\n \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.030167533468632726\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367406,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367406\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349417,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774631,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774631\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6693121693121693,\n \"acc_stderr\": 0.02422996529842509,\n \"acc_norm\": 0.6693121693121693,\n \"acc_norm_stderr\": 0.02422996529842509\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.017308381281034495,\n \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.017308381281034495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.03255086769970103,\n \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.03255086769970103\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02548549837334323,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02548549837334323\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9040404040404041,\n \"acc_stderr\": 0.020984808610047926,\n \"acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.020984808610047926\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.015216761819262584,\n \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.015216761819262584\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.764102564102564,\n \"acc_stderr\": 0.02152596540740873,\n \"acc_norm\": 0.764102564102564,\n \"acc_norm_stderr\": 0.02152596540740873\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.030182099804387262,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.030182099804387262\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.819327731092437,\n \"acc_stderr\": 0.02499196496660077,\n \"acc_norm\": 0.819327731092437,\n \"acc_norm_stderr\": 0.02499196496660077\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.926605504587156,\n \"acc_stderr\": 0.011180976446357573,\n \"acc_norm\": 0.926605504587156,\n \"acc_norm_stderr\": 0.011180976446357573\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089674,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089674\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758528,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758528\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744631,\n \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744631\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.03247224389917947,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.03247224389917947\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8466257668711656,\n \"acc_stderr\": 0.0283116014414386,\n \"acc_norm\": 0.8466257668711656,\n \"acc_norm_stderr\": 0.0283116014414386\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.032881802788086285,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.032881802788086285\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.01500631280644693,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.01500631280644693\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8876117496807152,\n \"acc_stderr\": 0.011294541351216554,\n \"acc_norm\": 0.8876117496807152,\n \"acc_norm_stderr\": 0.011294541351216554\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.02115267696657527,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.02115267696657527\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.582122905027933,\n \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.582122905027933,\n \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213502,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213502\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.819935691318328,\n \"acc_stderr\": 0.02182342285774494,\n \"acc_norm\": 0.819935691318328,\n \"acc_norm_stderr\": 0.02182342285774494\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.020581466138257135,\n \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.020581466138257135\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6063829787234043,\n \"acc_stderr\": 0.029144544781596157,\n \"acc_norm\": 0.6063829787234043,\n \"acc_norm_stderr\": 0.029144544781596157\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6003911342894394,\n \"acc_stderr\": 0.012510181636960672,\n \"acc_norm\": 0.6003911342894394,\n \"acc_norm_stderr\": 0.012510181636960672\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559352,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559352\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.016062056421968646,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.016062056421968646\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n \"acc_stderr\": 0.024112678240900798,\n \"acc_norm\": 0.8656716417910447,\n \"acc_norm_stderr\": 0.024112678240900798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32068543451652387,\n \"mc1_stderr\": 0.016339170373280906,\n \"mc2\": 0.44207231913277784,\n \"mc2_stderr\": 0.015063393630524507\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989247\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24109173616376042,\n \"acc_stderr\": 
0.011782246325099723\n }\n}\n```", "repo_url": "https://huggingface.co/kyujinpy/PlatYi-34B-200K-Q", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|arc:challenge|25_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|gsm8k|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hellaswag|10_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-34-24.325158.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-34-24.325158.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-34-24.325158.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T05-34-24.325158.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-34-24.325158.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T05_34_24.325158", "path": ["**/details_harness|winogrande|5_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T05-34-24.325158.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_10T05_34_24.325158", "path": ["results_2023-12-10T05-34-24.325158.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T05-34-24.325158.parquet"]}]}]} | 2023-12-10T05:37:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200K-Q
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-200K-Q on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
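The original code snippet was stripped out during text processing; a minimal sketch of it, assuming the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming scheme for this repo:

```python
from datasets import load_dataset

# Assumed dataset id, following the leaderboard's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200K-Q",
	"harness_winogrande_5",
	split="train")
```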
## Latest results
These are the latest results from run 2023-12-10T05:34:24.325158 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200K-Q",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-200K-Q o... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200K-Q",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyu... | [
6,
23,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200K-Q## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/Pla... |
8c843804cfdb78801edb2e757a44080ba5b77191 |
# Dataset Card for Evaluation run of rwitz/go-bruins-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/rwitz/go-bruins-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [rwitz/go-bruins-v2](https://huggingface.co/rwitz/go-bruins-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rwitz__go-bruins-v2",
"harness_winogrande_5",
split="train")
```
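The same pattern works for any other configuration; for example, a minimal sketch (assuming the "results" configuration and "latest" split described above) for pulling the aggregated metrics directly:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always points to the newest run
results = load_dataset("open-llm-leaderboard/details_rwitz__go-bruins-v2",
	"results",
	split="latest")
```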
## Latest results
These are the [latest results from run 2023-12-10T05:42:16.717744](https://huggingface.co/datasets/open-llm-leaderboard/details_rwitz__go-bruins-v2/blob/main/results_2023-12-10T05-42-16.717744.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6521685007083396,
"acc_stderr": 0.03205721368340006,
"acc_norm": 0.6521344188001463,
"acc_norm_stderr": 0.032717447545898726,
"mc1": 0.4369645042839657,
"mc1_stderr": 0.017363844503195974,
"mc2": 0.5970340702765861,
"mc2_stderr": 0.015540536389561436
},
"harness|arc:challenge|25": {
"acc": 0.6697952218430034,
"acc_stderr": 0.013743085603760424,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.01341751914471641
},
"harness|hellaswag|10": {
"acc": 0.6937860983867755,
"acc_stderr": 0.004599776866717491,
"acc_norm": 0.8705437163911571,
"acc_norm_stderr": 0.003350181812941604
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469553,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469553
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944863,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944863
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.016568971233548606,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.016568971233548606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.02645722506781103,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.02645722506781103
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4369645042839657,
"mc1_stderr": 0.017363844503195974,
"mc2": 0.5970340702765861,
"mc2_stderr": 0.015540536389561436
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.6967399545109931,
"acc_stderr": 0.0126615026634187
}
}
```
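For a quick per-task ranking out of a payload shaped like the one above, a minimal sketch, assuming the printed dict has been bound to a Python variable `results`:

```python
# `results` is the task -> metrics dict printed above
def rank_by_acc(results: dict) -> list[tuple[str, float]]:
    # truthfulqa reports mc1/mc2 rather than acc, so guard on the key
    scored = [(task, m["acc"]) for task, m in results.items()
              if task != "all" and "acc" in m]
    return sorted(scored, key=lambda kv: kv[1], reverse=True)

for task, acc in rank_by_acc(results)[:5]:
    print(f"{task}: {acc:.3f}")
```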
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_rwitz__go-bruins-v2 | [
"region:us"
] | 2023-12-10T05:39:00+00:00 | {"pretty_name": "Evaluation run of rwitz/go-bruins-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [rwitz/go-bruins-v2](https://huggingface.co/rwitz/go-bruins-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rwitz__go-bruins-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T05:42:16.717744](https://huggingface.co/datasets/open-llm-leaderboard/details_rwitz__go-bruins-v2/blob/main/results_2023-12-10T05-42-16.717744.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6521685007083396,\n \"acc_stderr\": 0.03205721368340006,\n \"acc_norm\": 0.6521344188001463,\n \"acc_norm_stderr\": 0.032717447545898726,\n \"mc1\": 0.4369645042839657,\n \"mc1_stderr\": 0.017363844503195974,\n \"mc2\": 0.5970340702765861,\n \"mc2_stderr\": 0.015540536389561436\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6697952218430034,\n \"acc_stderr\": 0.013743085603760424,\n \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.01341751914471641\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6937860983867755,\n \"acc_stderr\": 0.004599776866717491,\n \"acc_norm\": 0.8705437163911571,\n \"acc_norm_stderr\": 0.003350181812941604\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469553,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469553\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n \"acc_norm\": 0.6692307692307692,\n 
\"acc_norm_stderr\": 0.023854795680971128\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944863,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944863\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137276,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137276\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993457\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n \"acc_stderr\": 0.016568971233548606,\n \"acc_norm\": 0.4324022346368715,\n \"acc_norm_stderr\": 0.016568971233548606\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.02645722506781103,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.02645722506781103\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4369645042839657,\n \"mc1_stderr\": 0.017363844503195974,\n \"mc2\": 0.5970340702765861,\n \"mc2_stderr\": 0.015540536389561436\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6967399545109931,\n \"acc_stderr\": 0.0126615026634187\n }\n}\n```", "repo_url": "https://huggingface.co/rwitz/go-bruins-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|arc:challenge|25_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|arc:challenge|25_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|gsm8k|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|gsm8k|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hellaswag|10_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hellaswag|10_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-36-09.275219.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T05-36-09.275219.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-42-16.717744.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-42-16.717744.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-42-16.717744.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T05-42-16.717744.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-36-09.275219.parquet"]}, 
{"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["**/details_harness|winogrande|5_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": ["**/details_harness|winogrande|5_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T05-42-16.717744.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T05_36_09.275219", "path": ["results_2023-12-10T05-36-09.275219.parquet"]}, {"split": "2023_12_10T05_42_16.717744", "path": 
["results_2023-12-10T05-42-16.717744.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T05-42-16.717744.parquet"]}]}]} | 2023-12-10T05:45:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of rwitz/go-bruins-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model rwitz/go-bruins-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
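Below is the standard loading snippet for these leaderboard cards (the code block was stripped from this processed copy); the repository name is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Repository name inferred from the leaderboard's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_rwitz__go-bruins-v2",
                    "harness_winogrande_5",
                    split="train")
```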
## Latest results
These are the latest results from run 2023-12-10T05:42:16.717744 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of rwitz/go-bruins-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model rwitz/go-bruins-v2 on the Open LLM L... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of rwitz/go-bruins-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model rwitz/go-br... | [
6,
19,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of rwitz/go-bruins-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model rwitz/go-bruins-v2 o... |
691e13d924106d57904ab820eac91361a27d12cc | # Dataset Card for "hf-stack-zyx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | update0909/hf-stack-zyx | [
"region:us"
] | 2023-12-10T05:47:53+00:00 | {"dataset_info": {"features": [{"name": "repo_id", "dtype": "string"}, {"name": "file_path", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 114334795, "num_examples": 7212}], "download_size": 38900746, "dataset_size": 114334795}} | 2023-12-10T05:56:32+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "hf-stack-zyx"
More Information needed | [
"# Dataset Card for \"hf-stack-zyx\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"hf-stack-zyx\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"hf-stack-zyx\"\n\nMore Information needed"
] |
0f1d6c355f1e6e2bf8c7ffeeb532ecab250c3ac9 | # Dataset Card for "MarcBotClips"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | GeneralRincewind/MarcBotClips | [
"region:us"
] | 2023-12-10T05:49:18+00:00 | {"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}], "splits": [{"name": "train", "num_bytes": 21226643625.208, "num_examples": 12064}], "download_size": 18110074182, "dataset_size": 21226643625.208}} | 2023-12-10T06:03:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "MarcBotClips"
More Information needed | [
"# Dataset Card for \"MarcBotClips\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"MarcBotClips\"\n\nMore Information needed"
] | [
6,
14
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"MarcBotClips\"\n\nMore Information needed"
] |
b5bf143fe7ff42131d1cbe78fad4ce558cd1fd51 |
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q-FastChat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q-FastChat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-Llama-Q-FastChat](https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q-FastChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat",
"harness_winogrande_5",
split="train")
```
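The aggregated scores described above can be pulled the same way; a minimal sketch, assuming this card follows the same "results" configuration and "latest" split layout as the other leaderboard cards in this dump:

```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run for this model
results = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat",
                       "results",
                       split="latest")
```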
## Latest results
These are the [latest results from run 2023-12-10T05:55:07.023442](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat/blob/main/results_2023-12-10T05-55-07.023442.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7741514926490987,
"acc_stderr": 0.027646135380835733,
"acc_norm": 0.7828326159595959,
"acc_norm_stderr": 0.02814394317924737,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5362104216200869,
"mc2_stderr": 0.01504184962981019
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042194,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6533559051981677,
"acc_stderr": 0.004749286071559569,
"acc_norm": 0.8525194184425413,
"acc_norm_stderr": 0.003538596773704832
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7555555555555555,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.7555555555555555,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474938,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474938
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372277,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440182,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440182
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7872340425531915,
"acc_stderr": 0.02675439134803976,
"acc_norm": 0.7872340425531915,
"acc_norm_stderr": 0.02675439134803976
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7724137931034483,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.7724137931034483,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.753968253968254,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.753968253968254,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6031746031746031,
"acc_stderr": 0.043758884927270585,
"acc_norm": 0.6031746031746031,
"acc_norm_stderr": 0.043758884927270585
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9258064516129032,
"acc_stderr": 0.01490952930054621,
"acc_norm": 0.9258064516129032,
"acc_norm_stderr": 0.01490952930054621
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6847290640394089,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.6847290640394089,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.0188526702349931,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.0188526702349931
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.823076923076923,
"acc_stderr": 0.01934807017439698,
"acc_norm": 0.823076923076923,
"acc_norm_stderr": 0.01934807017439698
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4925925925925926,
"acc_stderr": 0.0304821923951915,
"acc_norm": 0.4925925925925926,
"acc_norm_stderr": 0.0304821923951915
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.02186325849485212,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.02186325849485212
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5496688741721855,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.5496688741721855,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9302752293577982,
"acc_stderr": 0.010919426411848607,
"acc_norm": 0.9302752293577982,
"acc_norm_stderr": 0.010919426411848607
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0305467452649532,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0305467452649532
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658935,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658935
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.919831223628692,
"acc_stderr": 0.017676679991891632,
"acc_norm": 0.919831223628692,
"acc_norm_stderr": 0.017676679991891632
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951538,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951538
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9338842975206612,
"acc_stderr": 0.022683403691723312,
"acc_norm": 0.9338842975206612,
"acc_norm_stderr": 0.022683403691723312
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.031766839486404054,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.031766839486404054
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9487179487179487,
"acc_stderr": 0.014450181176872736,
"acc_norm": 0.9487179487179487,
"acc_norm_stderr": 0.014450181176872736
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.9,
"acc_stderr": 0.03015113445777634,
"acc_norm": 0.9,
"acc_norm_stderr": 0.03015113445777634
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9080459770114943,
"acc_stderr": 0.010333225570778518,
"acc_norm": 0.9080459770114943,
"acc_norm_stderr": 0.010333225570778518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442265,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442265
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.788826815642458,
"acc_stderr": 0.013650276794312199,
"acc_norm": 0.788826815642458,
"acc_norm_stderr": 0.013650276794312199
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8660130718954249,
"acc_stderr": 0.019504890618464815,
"acc_norm": 0.8660130718954249,
"acc_norm_stderr": 0.019504890618464815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8456591639871383,
"acc_stderr": 0.020519050342084726,
"acc_norm": 0.8456591639871383,
"acc_norm_stderr": 0.020519050342084726
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8580246913580247,
"acc_stderr": 0.019420260109438293,
"acc_norm": 0.8580246913580247,
"acc_norm_stderr": 0.019420260109438293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6631205673758865,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.6631205673758865,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6186440677966102,
"acc_stderr": 0.01240550940188812,
"acc_norm": 0.6186440677966102,
"acc_norm_stderr": 0.01240550940188812
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8272058823529411,
"acc_stderr": 0.022966067585581767,
"acc_norm": 0.8272058823529411,
"acc_norm_stderr": 0.022966067585581767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8316993464052288,
"acc_stderr": 0.01513580333869338,
"acc_norm": 0.8316993464052288,
"acc_norm_stderr": 0.01513580333869338
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.023661699177098615,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.023661699177098615
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015578,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015578
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5362104216200869,
"mc2_stderr": 0.01504184962981019
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.010759352014855944
},
"harness|gsm8k|5": {
"acc": 0.44351781652767247,
"acc_stderr": 0.013684327592606165
}
}
```
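The aggregated metrics above are also stored in the "results" configuration of this repo. A minimal sketch of loading them into a pandas DataFrame (assuming the `datasets` library is installed; the `latest` split name comes from this repo's config definition):

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of each run;
# its "latest" split points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat",
    "results",
    split="latest",
)

# Convert to pandas for quick inspection of the scores shown above.
print(results.to_pandas().head())
```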
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat | [
"region:us"
] | 2023-12-10T05:57:55+00:00 | {"pretty_name": "Evaluation run of kyujinpy/PlatYi-34B-Llama-Q-FastChat", "dataset_summary": "Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-Llama-Q-FastChat](https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q-FastChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T05:55:07.023442](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat/blob/main/results_2023-12-10T05-55-07.023442.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7741514926490987,\n \"acc_stderr\": 0.027646135380835733,\n \"acc_norm\": 0.7828326159595959,\n \"acc_norm_stderr\": 0.02814394317924737,\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5362104216200869,\n \"mc2_stderr\": 0.01504184962981019\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042194,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6533559051981677,\n \"acc_stderr\": 0.004749286071559569,\n \"acc_norm\": 0.8525194184425413,\n \"acc_norm_stderr\": 0.003538596773704832\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7555555555555555,\n \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.7555555555555555,\n \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474938,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474938\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372277,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372277\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n \"acc_stderr\": 0.024774516250440182,\n \"acc_norm\": 0.9027777777777778,\n \"acc_norm_stderr\": 0.024774516250440182\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7872340425531915,\n \"acc_stderr\": 0.02675439134803976,\n \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.02675439134803976\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7724137931034483,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.7724137931034483,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.753968253968254,\n \"acc_stderr\": 0.022182037202948365,\n \"acc_norm\": 0.753968253968254,\n \"acc_norm_stderr\": 0.022182037202948365\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6031746031746031,\n \"acc_stderr\": 0.043758884927270585,\n \"acc_norm\": 0.6031746031746031,\n \"acc_norm_stderr\": 0.043758884927270585\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9258064516129032,\n \"acc_stderr\": 0.01490952930054621,\n \"acc_norm\": 0.9258064516129032,\n \"acc_norm_stderr\": 0.01490952930054621\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6847290640394089,\n \"acc_stderr\": 0.03269080871970186,\n \"acc_norm\": 0.6847290640394089,\n \"acc_norm_stderr\": 0.03269080871970186\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9242424242424242,\n \"acc_stderr\": 0.0188526702349931,\n \"acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.0188526702349931\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.823076923076923,\n \"acc_stderr\": 0.01934807017439698,\n \"acc_norm\": 0.823076923076923,\n \"acc_norm_stderr\": 0.01934807017439698\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4925925925925926,\n \"acc_stderr\": 0.0304821923951915,\n \"acc_norm\": 0.4925925925925926,\n \"acc_norm_stderr\": 0.0304821923951915\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.02186325849485212,\n \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.02186325849485212\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5496688741721855,\n \"acc_stderr\": 0.04062290018683775,\n \"acc_norm\": 0.5496688741721855,\n \"acc_norm_stderr\": 0.04062290018683775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9302752293577982,\n \"acc_stderr\": 0.010919426411848607,\n \"acc_norm\": 0.9302752293577982,\n \"acc_norm_stderr\": 0.010919426411848607\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0305467452649532,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0305467452649532\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658935,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658935\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.919831223628692,\n \"acc_stderr\": 0.017676679991891632,\n \"acc_norm\": 0.919831223628692,\n \"acc_norm_stderr\": 0.017676679991891632\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9338842975206612,\n \"acc_stderr\": 0.022683403691723312,\n \"acc_norm\": 0.9338842975206612,\n \"acc_norm_stderr\": 0.022683403691723312\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.031766839486404054,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.031766839486404054\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9487179487179487,\n \"acc_stderr\": 0.014450181176872736,\n \"acc_norm\": 0.9487179487179487,\n \"acc_norm_stderr\": 0.014450181176872736\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9080459770114943,\n \"acc_stderr\": 0.010333225570778518,\n \"acc_norm\": 0.9080459770114943,\n \"acc_norm_stderr\": 0.010333225570778518\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442265,\n \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442265\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.788826815642458,\n \"acc_stderr\": 0.013650276794312199,\n \"acc_norm\": 0.788826815642458,\n \"acc_norm_stderr\": 0.013650276794312199\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8660130718954249,\n \"acc_stderr\": 0.019504890618464815,\n \"acc_norm\": 0.8660130718954249,\n \"acc_norm_stderr\": 0.019504890618464815\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8456591639871383,\n \"acc_stderr\": 0.020519050342084726,\n \"acc_norm\": 0.8456591639871383,\n \"acc_norm_stderr\": 0.020519050342084726\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8580246913580247,\n \"acc_stderr\": 0.019420260109438293,\n \"acc_norm\": 0.8580246913580247,\n \"acc_norm_stderr\": 0.019420260109438293\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6631205673758865,\n \"acc_stderr\": 0.02819553487396673,\n \"acc_norm\": 0.6631205673758865,\n \"acc_norm_stderr\": 0.02819553487396673\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6186440677966102,\n \"acc_stderr\": 0.01240550940188812,\n \"acc_norm\": 0.6186440677966102,\n \"acc_norm_stderr\": 0.01240550940188812\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8272058823529411,\n \"acc_stderr\": 0.022966067585581767,\n \"acc_norm\": 0.8272058823529411,\n \"acc_norm_stderr\": 0.022966067585581767\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8316993464052288,\n \"acc_stderr\": 0.01513580333869338,\n \"acc_norm\": 0.8316993464052288,\n \"acc_norm_stderr\": 0.01513580333869338\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.023661699177098615,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.023661699177098615\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015578,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015578\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5362104216200869,\n \"mc2_stderr\": 0.01504184962981019\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.010759352014855944\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44351781652767247,\n \"acc_stderr\": 0.013684327592606165\n 
}\n}\n```", "repo_url": "https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q-FastChat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|arc:challenge|25_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|gsm8k|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hellaswag|10_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-55-07.023442.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-55-07.023442.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-55-07.023442.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T05-55-07.023442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-55-07.023442.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T05_55_07.023442", "path": ["**/details_harness|winogrande|5_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T05-55-07.023442.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_10T05_55_07.023442", "path": ["results_2023-12-10T05-55-07.023442.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T05-55-07.023442.parquet"]}]}]} | 2023-12-10T05:58:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q-FastChat
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-Llama-Q-FastChat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
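The snippet below is a minimal sketch mirroring the loader given in this repo's metadata (it assumes the `datasets` library is installed):

```python
from datasets import load_dataset

# Load the details of one evaluated task (here the 5-shot Winogrande run);
# the "train" split always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat",
    "harness_winogrande_5",
    split="train",
)
```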
## Latest results
These are the latest results from run 2023-12-10T05:55:07.023442 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
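As a sketch, the raw per-run results file can also be fetched directly from the repo (assuming `huggingface_hub` is installed; the filename is inferred from the run timestamp and the repo's `results_<timestamp>.json` naming convention, so treat it as an assumption):

```python
import json

from huggingface_hub import hf_hub_download

# Download the per-run results JSON from the dataset repo root.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat",
    filename="results_2023-12-10T05-55-07.023442.json",
    repo_type="dataset",
)
with open(path) as f:
    run_results = json.load(f)

# The exact schema is not guaranteed here; list top-level keys first.
print(sorted(run_results.keys()))
```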
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q-FastChat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatYi-34... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q-FastChat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of... | [
6,
28,
31,
177,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q-FastChat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ky... |
04decf3a90d3eb5a4930112aa574b9a908c4e141 |
# Dataset Card for Evaluation run of mncai/yi-34B-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mncai/yi-34B-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mncai/yi-34B-v2](https://huggingface.co/mncai/yi-34B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__yi-34B-v2",
"harness_winogrande_5",
split="train")
```
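To enumerate all 63 per-task configurations (plus "results") before picking one, a small sketch assuming the `datasets` library:

```python
from datasets import get_dataset_config_names

# List every available configuration of this details repo.
configs = get_dataset_config_names("open-llm-leaderboard/details_mncai__yi-34B-v2")
print(len(configs), configs[:5])
```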
## Latest results
These are the [latest results from run 2023-12-10T05:59:23.635398](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__yi-34B-v2/blob/main/results_2023-12-10T05-59-23.635398.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7523453787674309,
"acc_stderr": 0.02848483810892476,
"acc_norm": 0.756411391315877,
"acc_norm_stderr": 0.029027731000189076,
"mc1": 0.4173806609547124,
"mc1_stderr": 0.017262891063272175,
"mc2": 0.5733928094646895,
"mc2_stderr": 0.01509801265375318
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955007,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6523600876319459,
"acc_stderr": 0.004752476997887817,
"acc_norm": 0.8500298745269866,
"acc_norm_stderr": 0.0035631244274585126
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6962962962962963,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.6962962962962963,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474935,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474935
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8150943396226416,
"acc_stderr": 0.02389335183446432,
"acc_norm": 0.8150943396226416,
"acc_norm_stderr": 0.02389335183446432
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387533,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387533
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774631,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774631
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6878306878306878,
"acc_stderr": 0.023865206836972592,
"acc_norm": 0.6878306878306878,
"acc_norm_stderr": 0.023865206836972592
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6748768472906403,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.6748768472906403,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284343,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284343
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8282051282051283,
"acc_stderr": 0.01912490360342356,
"acc_norm": 0.8282051282051283,
"acc_norm_stderr": 0.01912490360342356
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3962962962962963,
"acc_stderr": 0.029822619458534,
"acc_norm": 0.3962962962962963,
"acc_norm_stderr": 0.029822619458534
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.023005459446673964,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.023005459446673964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.926605504587156,
"acc_stderr": 0.011180976446357573,
"acc_norm": 0.926605504587156,
"acc_norm_stderr": 0.011180976446357573
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01990739979131695,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01990739979131695
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8161434977578476,
"acc_stderr": 0.025998379092356517,
"acc_norm": 0.8161434977578476,
"acc_norm_stderr": 0.025998379092356517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275896,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.02728524631275896
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.03145703854306251,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.03145703854306251
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.010648356301876346,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.010648356301876346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8063583815028902,
"acc_stderr": 0.021274230317515547,
"acc_norm": 0.8063583815028902,
"acc_norm_stderr": 0.021274230317515547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7150837988826816,
"acc_stderr": 0.015096222302469792,
"acc_norm": 0.7150837988826816,
"acc_norm_stderr": 0.015096222302469792
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213512,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213512
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.022122439772480768,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.022122439772480768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.019242526226544543,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.019242526226544543
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.624113475177305,
"acc_stderr": 0.028893955412115875,
"acc_norm": 0.624113475177305,
"acc_norm_stderr": 0.028893955412115875
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5984354628422425,
"acc_stderr": 0.01252031512014712,
"acc_norm": 0.5984354628422425,
"acc_norm_stderr": 0.01252031512014712
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02236867256288675,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02236867256288675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.022923004094736833,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.022923004094736833
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4173806609547124,
"mc1_stderr": 0.017262891063272175,
"mc2": 0.5733928094646895,
"mc2_stderr": 0.01509801265375318
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273759
},
"harness|gsm8k|5": {
"acc": 0.6497346474601972,
"acc_stderr": 0.013140409455571286
}
}
```
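These aggregated metrics can also be loaded programmatically through the "results" configuration; a minimal sketch, where the "latest" split points to this run:
```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_mncai__yi-34B-v2",
	"results",
	split="latest")
```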
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_mncai__yi-34B-v2 | [
"region:us"
] | 2023-12-10T06:02:12+00:00 | {"pretty_name": "Evaluation run of mncai/yi-34B-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [mncai/yi-34B-v2](https://huggingface.co/mncai/yi-34B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__yi-34B-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T05:59:23.635398](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__yi-34B-v2/blob/main/results_2023-12-10T05-59-23.635398.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7523453787674309,\n \"acc_stderr\": 0.02848483810892476,\n \"acc_norm\": 0.756411391315877,\n \"acc_norm_stderr\": 0.029027731000189076,\n \"mc1\": 0.4173806609547124,\n \"mc1_stderr\": 0.017262891063272175,\n \"mc2\": 0.5733928094646895,\n \"mc2_stderr\": 0.01509801265375318\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955007,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6523600876319459,\n \"acc_stderr\": 0.004752476997887817,\n \"acc_norm\": 0.8500298745269866,\n \"acc_norm_stderr\": 0.0035631244274585126\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.6962962962962963,\n \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474935,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474935\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8150943396226416,\n \"acc_stderr\": 0.02389335183446432,\n \"acc_norm\": 0.8150943396226416,\n \"acc_norm_stderr\": 0.02389335183446432\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 
0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387533,\n \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387533\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774631,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774631\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6878306878306878,\n \"acc_stderr\": 0.023865206836972592,\n \"acc_norm\": 0.6878306878306878,\n \"acc_norm_stderr\": 0.023865206836972592\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6748768472906403,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284343,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284343\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8282051282051283,\n \"acc_stderr\": 
0.01912490360342356,\n \"acc_norm\": 0.8282051282051283,\n \"acc_norm_stderr\": 0.01912490360342356\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3962962962962963,\n \"acc_stderr\": 0.029822619458534,\n \"acc_norm\": 0.3962962962962963,\n \"acc_norm_stderr\": 0.029822619458534\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.023005459446673964,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.023005459446673964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.926605504587156,\n \"acc_stderr\": 0.011180976446357573,\n \"acc_norm\": 0.926605504587156,\n \"acc_norm_stderr\": 0.011180976446357573\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.01990739979131695,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01990739979131695\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8161434977578476,\n \"acc_stderr\": 0.025998379092356517,\n \"acc_norm\": 0.8161434977578476,\n \"acc_norm_stderr\": 0.025998379092356517\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275896,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275896\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.03145703854306251,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.03145703854306251\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n \"acc_stderr\": 0.010648356301876346,\n \"acc_norm\": 0.9016602809706258,\n \"acc_norm_stderr\": 
0.010648356301876346\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8063583815028902,\n \"acc_stderr\": 0.021274230317515547,\n \"acc_norm\": 0.8063583815028902,\n \"acc_norm_stderr\": 0.021274230317515547\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7150837988826816,\n \"acc_stderr\": 0.015096222302469792,\n \"acc_norm\": 0.7150837988826816,\n \"acc_norm_stderr\": 0.015096222302469792\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213512,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213512\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n \"acc_stderr\": 0.022122439772480768,\n \"acc_norm\": 0.8135048231511254,\n \"acc_norm_stderr\": 0.022122439772480768\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.019242526226544543,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.019242526226544543\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.624113475177305,\n \"acc_stderr\": 0.028893955412115875,\n \"acc_norm\": 0.624113475177305,\n \"acc_norm_stderr\": 0.028893955412115875\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5984354628422425,\n \"acc_stderr\": 0.01252031512014712,\n \"acc_norm\": 0.5984354628422425,\n \"acc_norm_stderr\": 0.01252031512014712\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02236867256288675,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02236867256288675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.022923004094736833,\n \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.022923004094736833\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4173806609547124,\n \"mc1_stderr\": 0.017262891063272175,\n \"mc2\": 0.5733928094646895,\n \"mc2_stderr\": 0.01509801265375318\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273759\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6497346474601972,\n \"acc_stderr\": 0.013140409455571286\n }\n}\n```", "repo_url": "https://huggingface.co/mncai/yi-34B-v2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|arc:challenge|25_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|gsm8k|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hellaswag|10_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-59-23.635398.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-59-23.635398.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-59-23.635398.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T05-59-23.635398.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-59-23.635398.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-59-23.635398.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["**/details_harness|winogrande|5_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T05-59-23.635398.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T05_59_23.635398", "path": ["results_2023-12-10T05-59-23.635398.parquet"]}, {"split": "latest", "path": 
["results_2023-12-10T05-59-23.635398.parquet"]}]}]} | 2023-12-10T06:02:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mncai/yi-34B-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model mncai/yi-34B-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
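A minimal loading sketch (the repository id `open-llm-leaderboard/details_mncai__yi-34B-v2` is inferred from the `details_{org}__{model}` naming pattern used by the other evaluation cards in this dump, so treat it as an assumption):

```python
from datasets import load_dataset

# Load one evaluation config; the "train" split points at the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_mncai__yi-34B-v2",  # inferred repo id
    "harness_winogrande_5",
    split="train",
)
```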
## Latest results
These are the latest results from run 2023-12-10T05:59:23.635398 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
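The per-task numbers can be retrieved programmatically; a sketch, again assuming the inferred repository id and using the "results" config described above:

```python
from datasets import load_dataset

# The aggregated metrics live in the "results" config; the "latest" split
# always tracks the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_mncai__yi-34B-v2",  # inferred repo id
    "results",
    split="latest",
)
print(results[0])  # inspect the first row of aggregated scores
```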
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of mncai/yi-34B-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model mncai/yi-34B-v2 on the Open LLM Leaderb... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mncai/yi-34B-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model mncai/yi-34B-v... | [
6,
19,
31,
168,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mncai/yi-34B-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model mncai/yi-34B-v2 on the ... |
227f1e4394690659ed979e5524e0a9cb7cc85042 | # Dataset Card for "ApolloAuto-zyx-apollo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | update0909/ApolloAuto-zyx-apollo | [
"region:us"
] | 2023-12-10T06:05:43+00:00 | {"dataset_info": {"features": [{"name": "repo_id", "dtype": "string"}, {"name": "file_path", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 152781948, "num_examples": 17194}], "download_size": 46566010, "dataset_size": 152781948}} | 2023-12-10T06:59:59+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ApolloAuto-zyx-apollo"
More Information needed | [
"# Dataset Card for \"ApolloAuto-zyx-apollo\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ApolloAuto-zyx-apollo\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ApolloAuto-zyx-apollo\"\n\nMore Information needed"
] |
c78d0cfe76cb42007625177433bbff2aa4689921 | Welcome to the Furry Face Dataset! This dataset is used to fine-tune Stable Diffusion for translating human faces into anthropomorphic animal faces!
This is for CS 548 at SUNY Poly | DAura951/Furry-Face-Dataset | [
"region:us"
] | 2023-12-10T06:23:23+00:00 | {} | 2023-12-10T22:56:35+00:00 | [] | [] | TAGS
#region-us
| Welcome to the Furry Face Dataset! This dataset is used to fine-tune Stable Diffusion for translating human faces into anthropomorphic animal faces!
This is for CS 548 at SUNY Poly | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
1d3c95b41f7f87da40165fef4fa8db7896aed5a9 |
# Dataset Card for Evaluation run of chargoddard/servile-harpsichord-cdpo
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/servile-harpsichord-cdpo
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/servile-harpsichord-cdpo](https://huggingface.co/chargoddard/servile-harpsichord-cdpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-10T06:44:09.091422](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo/blob/main/results_2023-12-10T06-44-09.091422.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6467821760747017,
"acc_stderr": 0.032099406932013255,
"acc_norm": 0.6493833410875584,
"acc_norm_stderr": 0.032737739125074355,
"mc1": 0.4369645042839657,
"mc1_stderr": 0.017363844503195978,
"mc2": 0.6061030127349698,
"mc2_stderr": 0.015471882890395387
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839157,
"acc_norm": 0.6732081911262798,
"acc_norm_stderr": 0.013706665975587331
},
"harness|hellaswag|10": {
"acc": 0.6618203545110536,
"acc_stderr": 0.004721231637092722,
"acc_norm": 0.851822346146186,
"acc_norm_stderr": 0.0035454991695580435
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880277,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126253,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126253
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546837,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546837
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.0165136760311796,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.0165136760311796
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889135,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889135
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4369645042839657,
"mc1_stderr": 0.017363844503195978,
"mc2": 0.6061030127349698,
"mc2_stderr": 0.015471882890395387
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987729
},
"harness|gsm8k|5": {
"acc": 0.5708870356330553,
"acc_stderr": 0.013633369425647234
}
}
```
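As a complement to the loading snippet above, a short sketch of the split-naming scheme (the config and split names are taken from this card's metadata; `datasets` must be installed):

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo"

# "latest" always tracks the newest run of a config...
gsm8k_latest = load_dataset(REPO, "harness_gsm8k_5", split="latest")

# ...while the same run can also be pinned by its timestamped split name.
gsm8k_run = load_dataset(
    REPO, "harness_gsm8k_5", split="2023_12_10T06_44_09.091422"
)

# The aggregated metrics shown above live in the "results" config.
results = load_dataset(REPO, "results", split="latest")
```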
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo | [
"region:us"
] | 2023-12-10T06:47:01+00:00 | {"pretty_name": "Evaluation run of chargoddard/servile-harpsichord-cdpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [chargoddard/servile-harpsichord-cdpo](https://huggingface.co/chargoddard/servile-harpsichord-cdpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T06:44:09.091422](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo/blob/main/results_2023-12-10T06-44-09.091422.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6467821760747017,\n \"acc_stderr\": 0.032099406932013255,\n \"acc_norm\": 0.6493833410875584,\n \"acc_norm_stderr\": 0.032737739125074355,\n \"mc1\": 0.4369645042839657,\n \"mc1_stderr\": 0.017363844503195978,\n \"mc2\": 0.6061030127349698,\n \"mc2_stderr\": 0.015471882890395387\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839157,\n \"acc_norm\": 0.6732081911262798,\n \"acc_norm_stderr\": 0.013706665975587331\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6618203545110536,\n \"acc_stderr\": 0.004721231637092722,\n \"acc_norm\": 0.851822346146186,\n \"acc_norm_stderr\": 0.0035454991695580435\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880277,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880277\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126253,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126253\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546837,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546837\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n \"acc_stderr\": 0.0165136760311796,\n \"acc_norm\": 0.4212290502793296,\n \"acc_norm_stderr\": 0.0165136760311796\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n \"acc_stderr\": 0.012712265105889135,\n \"acc_norm\": 0.45241199478487615,\n \"acc_norm_stderr\": 0.012712265105889135\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4369645042839657,\n \"mc1_stderr\": 0.017363844503195978,\n \"mc2\": 0.6061030127349698,\n \"mc2_stderr\": 0.015471882890395387\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5708870356330553,\n \"acc_stderr\": 0.013633369425647234\n }\n}\n```", 
"repo_url": "https://huggingface.co/chargoddard/servile-harpsichord-cdpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|arc:challenge|25_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|gsm8k|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hellaswag|10_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T06-44-09.091422.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T06-44-09.091422.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T06-44-09.091422.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T06-44-09.091422.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T06-44-09.091422.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T06_44_09.091422", "path": ["**/details_harness|winogrande|5_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T06-44-09.091422.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_10T06_44_09.091422", "path": ["results_2023-12-10T06-44-09.091422.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T06-44-09.091422.parquet"]}]}]} | 2023-12-10T06:47:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of chargoddard/servile-harpsichord-cdpo
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model chargoddard/servile-harpsichord-cdpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
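A minimal sketch, assuming the standard `open-llm-leaderboard/details_{org}__{model}` naming used for these detail datasets (the repository id below is inferred from that convention, not quoted from this record):

```python
from datasets import load_dataset

# Load the Winogrande 5-shot details from the latest run;
# the repository id follows the leaderboard's naming convention.
data = load_dataset("open-llm-leaderboard/details_chargoddard__servile-harpsichord-cdpo",
	"harness_winogrande_5",
	split="train")
```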
## Latest results
These are the latest results from run 2023-12-10T06:44:09.091422 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of chargoddard/servile-harpsichord-cdpo",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargoddard/servil... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chargoddard/servile-harpsichord-cdpo",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of... | [
6,
24,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chargoddard/servile-harpsichord-cdpo## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ch... |
538f7a39097dcad5007ed53a94fa2eb40f53f2e7 | # Dow30 Stock Prediction Dataset
## Overview
Welcome to the Dow30 Stock Prediction dataset! This dataset is designed to assist in predicting stock returns for companies in the Dow Jones Industrial Average (Dow30). It includes essential information about each company, such as news from the last two weeks, basic financial data, and stock prices over the same period.
## Dataset Structure
The dataset consists of the following columns (see the loading sketch after the list):
1. **prompt:** Information about the company, including news from the last two weeks, basic financial data, and stock prices for the same period. The system prompt is generated using the code provided in the [FinGPT_Forecaster](https://github.com/AI4Finance-Foundation/FinGPT/blob/master/fingpt/FinGPT_Forecaster/prepare_data.ipynb) repository.
2. **answer:** Stock return predictions generated by ChatGPT.
3. **period:** Time period of the data, recorded on a weekly basis.
4. **label:** Indicates whether the stock is predicted to go up or down, along with the percentage change.
5. **symbol:** Stock symbol representing the company in the Dow Jones Industrial Average.
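
A minimal loading sketch with the `datasets` library; the repository id, split names, and column names below are taken from this card's metadata:

```python
from datasets import load_dataset

# The dataset ships "train" (480 examples) and "test" (120 examples) splits.
data = load_dataset("descartes100/Dow30_stock_prediction", split="train")

# Each row carries the prompt, answer, period, label, and symbol columns.
example = data[0]
print(example["symbol"], example["period"], example["label"])
```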
| descartes100/Dow30_stock_prediction | [
"region:us"
] | 2023-12-10T07:37:36+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "period", "dtype": "string"}, {"name": "label", "dtype": "string"}, {"name": "symbol", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3127735, "num_examples": 480}, {"name": "test", "num_bytes": 797367, "num_examples": 120}], "download_size": 1523163, "dataset_size": 3925102}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2023-12-10T08:04:49+00:00 | [] | [] | TAGS
#region-us
| # Dow30 Stock Prediction Dataset
## Overview
Welcome to the Dow30 Stock Prediction dataset! This dataset is designed to assist in predicting stock returns for companies in the Dow Jones Industrial Average (Dow30). It includes essential information about each company, such as news from the last two weeks, basic financial data, and stock prices over the same period.
## Dataset Structure
The dataset consists of the following columns:
1. prompt: Information about the company, including news from the last two weeks, basic financial data, and stock prices for the same period. The system prompt is generated using the code provided in the FinGPT_Forecaster repository.
2. answer: Stock return predictions generated by ChatGPT.
3. period: Time period of the data, recorded on a weekly basis.
4. label: Indicates whether the stock is predicted to go up or down, along with the percentage change.
5. symbol: Stock symbol representing the company in the Dow Jones Industrial Average.
| [
"# Dow30 Stock Prediction Dataset",
"## Overview\n\nWelcome to the Dow30 Stock Prediction dataset! This dataset is designed to assist in predicting stock returns for companies in the Dow Jones Industrial Average (Dow30). It includes essential information about each company, such as news from the last two weeks, b... | [
"TAGS\n#region-us \n",
"# Dow30 Stock Prediction Dataset",
"## Overview\n\nWelcome to the Dow30 Stock Prediction dataset! This dataset is designed to assist in predicting stock returns for companies in the Dow Jones Industrial Average (Dow30). It includes essential information about each company, such as news f... | [
6,
9,
71,
140
] | [
"passage: TAGS\n#region-us \n# Dow30 Stock Prediction Dataset## Overview\n\nWelcome to the Dow30 Stock Prediction dataset! This dataset is designed to assist in predicting stock returns for companies in the Dow Jones Industrial Average (Dow30). It includes essential information about each company, such as news from... |
6d01e20b1eea9507fa67911b25b3fb1cd3875f88 |
# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-34b-qlora-adapter
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/Deacon-34b-qlora-adapter
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deacon-34b-qlora-adapter](https://huggingface.co/KnutJaegersberg/Deacon-34b-qlora-adapter) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-qlora-adapter",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-10T07:35:32.492424](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-qlora-adapter/blob/main/results_2023-12-10T07-35-32.492424.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7583327240683659,
"acc_stderr": 0.02818417949184259,
"acc_norm": 0.7633703000884471,
"acc_norm_stderr": 0.028709251183600685,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.0171938358120939,
"mc2": 0.5621149830422679,
"mc2_stderr": 0.015167725368215625
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257179,
"acc_norm": 0.6484641638225256,
"acc_norm_stderr": 0.013952413699600938
},
"harness|hellaswag|10": {
"acc": 0.655646285600478,
"acc_stderr": 0.004741859753178431,
"acc_norm": 0.8556064528978291,
"acc_norm_stderr": 0.003507699935074239
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.024270227737522715,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.024270227737522715
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02628055093284808,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02628055093284808
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.03435568056047875,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.03435568056047875
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5,
"acc_stderr": 0.04975185951049946,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04975185951049946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7659574468085106,
"acc_stderr": 0.027678452578212387,
"acc_norm": 0.7659574468085106,
"acc_norm_stderr": 0.027678452578212387
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8,
"acc_stderr": 0.0333333333333333,
"acc_norm": 0.8,
"acc_norm_stderr": 0.0333333333333333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6613756613756614,
"acc_stderr": 0.02437319786798306,
"acc_norm": 0.6613756613756614,
"acc_norm_stderr": 0.02437319786798306
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8741935483870967,
"acc_stderr": 0.018865834288029997,
"acc_norm": 0.8741935483870967,
"acc_norm_stderr": 0.018865834288029997
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.645320197044335,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.645320197044335,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706473,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706473
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909042,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909042
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.020473233173551965,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.020473233173551965
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.030384169232350825,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.030384169232350825
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707952,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707952
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.01180036136301657,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.01180036136301657
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.03236585252602157,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.03236585252602157
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316952,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316952
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9240506329113924,
"acc_stderr": 0.017244633251065702,
"acc_norm": 0.9240506329113924,
"acc_norm_stderr": 0.017244633251065702
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.02758406660220827,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.02758406660220827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9173553719008265,
"acc_stderr": 0.025135382356604227,
"acc_norm": 0.9173553719008265,
"acc_norm_stderr": 0.025135382356604227
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.029239272675632748,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.029239272675632748
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.912621359223301,
"acc_stderr": 0.027960689125970654,
"acc_norm": 0.912621359223301,
"acc_norm_stderr": 0.027960689125970654
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.01745698787243619,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.01745698787243619
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9029374201787995,
"acc_stderr": 0.010586474712018302,
"acc_norm": 0.9029374201787995,
"acc_norm_stderr": 0.010586474712018302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8323699421965318,
"acc_stderr": 0.02011057991973484,
"acc_norm": 0.8323699421965318,
"acc_norm_stderr": 0.02011057991973484
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6458100558659218,
"acc_stderr": 0.01599564494729923,
"acc_norm": 0.6458100558659218,
"acc_norm_stderr": 0.01599564494729923
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8660130718954249,
"acc_stderr": 0.019504890618464815,
"acc_norm": 0.8660130718954249,
"acc_norm_stderr": 0.019504890618464815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8392282958199357,
"acc_stderr": 0.020862388082391888,
"acc_norm": 0.8392282958199357,
"acc_norm_stderr": 0.020862388082391888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8765432098765432,
"acc_stderr": 0.018303868806891794,
"acc_norm": 0.8765432098765432,
"acc_norm_stderr": 0.018303868806891794
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6631205673758865,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.6631205673758865,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5971316818774446,
"acc_stderr": 0.012526955577118009,
"acc_norm": 0.5971316818774446,
"acc_norm_stderr": 0.012526955577118009
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02388688192244034,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02388688192244034
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.015422512066262549,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.015422512066262549
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824636,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015578,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015578
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.0171938358120939,
"mc2": 0.5621149830422679,
"mc2_stderr": 0.015167725368215625
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838899
},
"harness|gsm8k|5": {
"acc": 0.6224412433661866,
"acc_stderr": 0.013353150666358532
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-qlora-adapter | [
"region:us"
] | 2023-12-10T07:38:22+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Deacon-34b-qlora-adapter", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deacon-34b-qlora-adapter](https://huggingface.co/KnutJaegersberg/Deacon-34b-qlora-adapter) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-qlora-adapter\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T07:35:32.492424](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-qlora-adapter/blob/main/results_2023-12-10T07-35-32.492424.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7583327240683659,\n \"acc_stderr\": 0.02818417949184259,\n \"acc_norm\": 0.7633703000884471,\n \"acc_norm_stderr\": 0.028709251183600685,\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.0171938358120939,\n \"mc2\": 0.5621149830422679,\n \"mc2_stderr\": 0.015167725368215625\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257179,\n \"acc_norm\": 0.6484641638225256,\n \"acc_norm_stderr\": 0.013952413699600938\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.655646285600478,\n \"acc_stderr\": 0.004741859753178431,\n \"acc_norm\": 0.8556064528978291,\n \"acc_norm_stderr\": 0.003507699935074239\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.024270227737522715,\n \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.024270227737522715\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.02495991802891127,\n \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.02495991802891127\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02628055093284808,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02628055093284808\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.03435568056047875,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.03435568056047875\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7659574468085106,\n \"acc_stderr\": 0.027678452578212387,\n \"acc_norm\": 0.7659574468085106,\n \"acc_norm_stderr\": 0.027678452578212387\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0333333333333333,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6613756613756614,\n \"acc_stderr\": 0.02437319786798306,\n \"acc_norm\": 0.6613756613756614,\n \"acc_norm_stderr\": 0.02437319786798306\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8741935483870967,\n \"acc_stderr\": 0.018865834288029997,\n \"acc_norm\": 0.8741935483870967,\n \"acc_norm_stderr\": 0.018865834288029997\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.645320197044335,\n \"acc_stderr\": 0.03366124489051449,\n \"acc_norm\": 0.645320197044335,\n \"acc_norm_stderr\": 0.03366124489051449\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706473,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706473\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909042,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909042\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7948717948717948,\n \"acc_stderr\": 0.020473233173551965,\n \"acc_norm\": 0.7948717948717948,\n \"acc_norm_stderr\": 0.020473233173551965\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.030384169232350825,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.030384169232350825\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707952,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707952\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.01180036136301657,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.01180036136301657\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.03236585252602157,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.03236585252602157\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316952,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316952\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9240506329113924,\n \"acc_stderr\": 0.017244633251065702,\n \"acc_norm\": 0.9240506329113924,\n \"acc_norm_stderr\": 0.017244633251065702\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n \"acc_stderr\": 0.02758406660220827,\n \"acc_norm\": 0.7847533632286996,\n \"acc_norm_stderr\": 0.02758406660220827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9173553719008265,\n \"acc_stderr\": 0.025135382356604227,\n \"acc_norm\": 0.9173553719008265,\n \"acc_norm_stderr\": 0.025135382356604227\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.029239272675632748,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.029239272675632748\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n \"acc_stderr\": 0.01745698787243619,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.01745698787243619\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9029374201787995,\n \"acc_stderr\": 0.010586474712018302,\n \"acc_norm\": 0.9029374201787995,\n \"acc_norm_stderr\": 0.010586474712018302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8323699421965318,\n \"acc_stderr\": 0.02011057991973484,\n \"acc_norm\": 0.8323699421965318,\n \"acc_norm_stderr\": 0.02011057991973484\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6458100558659218,\n \"acc_stderr\": 0.01599564494729923,\n \"acc_norm\": 0.6458100558659218,\n \"acc_norm_stderr\": 0.01599564494729923\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8660130718954249,\n \"acc_stderr\": 0.019504890618464815,\n \"acc_norm\": 0.8660130718954249,\n \"acc_norm_stderr\": 0.019504890618464815\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8392282958199357,\n \"acc_stderr\": 0.020862388082391888,\n \"acc_norm\": 0.8392282958199357,\n \"acc_norm_stderr\": 0.020862388082391888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.018303868806891794,\n \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.018303868806891794\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6631205673758865,\n \"acc_stderr\": 0.02819553487396673,\n \"acc_norm\": 0.6631205673758865,\n \"acc_norm_stderr\": 0.02819553487396673\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5971316818774446,\n \"acc_stderr\": 0.012526955577118009,\n \"acc_norm\": 0.5971316818774446,\n \"acc_norm_stderr\": 0.012526955577118009\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02388688192244034,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02388688192244034\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262549,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262549\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824636,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015578,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015578\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.0171938358120939,\n \"mc2\": 0.5621149830422679,\n \"mc2_stderr\": 0.015167725368215625\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838899\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6224412433661866,\n \"acc_stderr\": 0.013353150666358532\n 
}\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/Deacon-34b-qlora-adapter", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|arc:challenge|25_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|gsm8k|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hellaswag|10_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T07-35-32.492424.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T07-35-32.492424.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T07-35-32.492424.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T07-35-32.492424.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T07-35-32.492424.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T07_35_32.492424", "path": ["**/details_harness|winogrande|5_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T07-35-32.492424.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_10T07_35_32.492424", "path": ["results_2023-12-10T07-35-32.492424.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T07-35-32.492424.parquet"]}]}]} | 2023-12-10T07:39:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-34b-qlora-adapter
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model KnutJaegersberg/Deacon-34b-qlora-adapter on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
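A minimal sketch follows (the repository id below is inferred from the leaderboard's `details_<org>__<model>` naming convention used elsewhere in this document, so treat it as an assumption rather than a confirmed path):

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard naming convention; the
# "harness_winogrande_5" config appears in this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-qlora-adapter",
    "harness_winogrande_5",
    split="train",
)
```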
## Latest results
These are the latest results from run 2023-12-10T07:35:32.492424 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-34b-qlora-adapter",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersber... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-34b-qlora-adapter",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation ru... | [
6,
25,
31,
174,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-34b-qlora-adapter## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of mode... |
9b960801da721d25745d23b14cb24ff78398b785 |
# How to use
```python
import tarfile

from huggingface_hub import hf_hub_download

# Download the archive from the Hub (repo_type="dataset" targets a dataset repo)
hf_dataset_identifier = "aisuko/ucf101-subset"
filename = "UCF101_subset.tar.gz"
file_path = hf_hub_download(repo_id=hf_dataset_identifier, filename=filename, repo_type="dataset")

# Extract the archive into the current working directory
with tarfile.open(file_path) as t:
    t.extractall(".")
```
# Check the folder
```
UCF101_subset/
    train/
        BandMarching/
            video_1.mp4
            video_2.mp4
            ...
        Archery/
            video_1.mp4
            video_2.mp4
            ...
        ...
    val/
        BandMarching/
            video_1.mp4
            video_2.mp4
            ...
        Archery/
            video_1.mp4
            video_2.mp4
            ...
        ...
    test/
        BandMarching/
            video_1.mp4
            video_2.mp4
            ...
        Archery/
            video_1.mp4
            video_2.mp4
            ...
        ...
```
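As a quick sanity check, you can count the extracted videos per split (a minimal sketch assuming the archive extracted into `UCF101_subset/` as shown above; the class folders in the listing are only a pattern, not an exhaustive list):

```python
import pathlib

# Count .mp4 files per split and the number of class folders they span.
dataset_root = pathlib.Path("UCF101_subset")
for split in ("train", "val", "test"):
    videos = sorted((dataset_root / split).glob("*/*.mp4"))
    classes = {v.parent.name for v in videos}
    print(f"{split}: {len(videos)} videos across {len(classes)} classes")
```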
| aisuko/ucf101-subset | [
"task_categories:video-classification",
"license:apache-2.0",
"region:us"
] | 2023-12-10T07:53:48+00:00 | {"license": "apache-2.0", "task_categories": ["video-classification"]} | 2023-12-10T08:16:05+00:00 | [] | [] | TAGS
#task_categories-video-classification #license-apache-2.0 #region-us
|
# How to use
# Check the folder
| [
"# How to use",
"# Check the folder"
] | [
"TAGS\n#task_categories-video-classification #license-apache-2.0 #region-us \n",
"# How to use",
"# Check the folder"
] | [
25,
4,
4
] | [
"passage: TAGS\n#task_categories-video-classification #license-apache-2.0 #region-us \n# How to use# Check the folder"
] |
ec65ab9d274e10ac0ece627a2ea600e4e84fe429 | # Dataset Card for "pc9Cap"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ArasAyen/pc9Cap | [
"region:us"
] | 2023-12-10T08:29:59+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 265997109.0, "num_examples": 302}], "download_size": 262523050, "dataset_size": 265997109.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-10T08:30:21+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "pc9Cap"
More Information needed | [
"# Dataset Card for \"pc9Cap\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"pc9Cap\"\n\nMore Information needed"
] | [
6,
13
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"pc9Cap\"\n\nMore Information needed"
] |
a54a662ed39df02d2f36019662b1e0f6116a5e76 |
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200k-Q-FastChat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kyujinpy/PlatYi-34B-200k-Q-FastChat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-200k-Q-FastChat](https://huggingface.co/kyujinpy/PlatYi-34B-200k-Q-FastChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-10T08:30:20.014698](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat/blob/main/results_2023-12-10T08-30-20.014698.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7630727247628006,
"acc_stderr": 0.028221206890446823,
"acc_norm": 0.770488792020382,
"acc_norm_stderr": 0.028732290582792492,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4838395775572536,
"mc2_stderr": 0.014874467350764172
},
"harness|arc:challenge|25": {
"acc": 0.613481228668942,
"acc_stderr": 0.014230084761910471,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726097
},
"harness|hellaswag|10": {
"acc": 0.6467835092611034,
"acc_stderr": 0.004769924131304649,
"acc_norm": 0.8445528779127663,
"acc_norm_stderr": 0.003615898928269288
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.03885004245800253,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.03885004245800253
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.02407999513006225,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.02407999513006225
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.026947483121496228,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.026947483121496228
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6403508771929824,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.6403508771929824,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7380952380952381,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.7380952380952381,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.017308381281034527,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.017308381281034527
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865397,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865397
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.0163199507007674,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.0163199507007674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295127,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295127
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8153846153846154,
"acc_stderr": 0.0196716324131003,
"acc_norm": 0.8153846153846154,
"acc_norm_stderr": 0.0196716324131003
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.030242862397654,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.030242862397654
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.02244826447683258,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.02244826447683258
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9302752293577982,
"acc_stderr": 0.010919426411848607,
"acc_norm": 0.9302752293577982,
"acc_norm_stderr": 0.010919426411848607
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065498,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065498
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035216,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035216
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.032472243899179465,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.032472243899179465
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.045479609997643757,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.045479609997643757
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331356,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253867,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253867
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9144316730523627,
"acc_stderr": 0.010002965568647286,
"acc_norm": 0.9144316730523627,
"acc_norm_stderr": 0.010002965568647286
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.815028901734104,
"acc_stderr": 0.020903975842083027,
"acc_norm": 0.815028901734104,
"acc_norm_stderr": 0.020903975842083027
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7262569832402235,
"acc_stderr": 0.014912413096372432,
"acc_norm": 0.7262569832402235,
"acc_norm_stderr": 0.014912413096372432
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.01970403918385981,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.01970403918385981
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.842443729903537,
"acc_stderr": 0.020692237273583984,
"acc_norm": 0.842443729903537,
"acc_norm_stderr": 0.020692237273583984
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8734567901234568,
"acc_stderr": 0.018498600558790906,
"acc_norm": 0.8734567901234568,
"acc_norm_stderr": 0.018498600558790906
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.028947338851614095,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.028947338851614095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6173402868318123,
"acc_stderr": 0.01241359588289327,
"acc_norm": 0.6173402868318123,
"acc_norm_stderr": 0.01241359588289327
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.024127463462650163,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.024127463462650163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101716,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101716
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4838395775572536,
"mc2_stderr": 0.014874467350764172
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.01108253884749189
},
"harness|gsm8k|5": {
"acc": 0.514783927217589,
"acc_stderr": 0.0137664630507876
}
}
```
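To work with these aggregated numbers programmatically, you can load the "results" configuration described above (a short sketch; the "latest" split always points to the most recent run, and the exact column layout of the results parquet is not shown in this card, so inspect it first):

```python
from datasets import load_dataset

# Load the aggregated metrics for this evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the schema before reading metric values
```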
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat | [
"region:us"
] | 2023-12-10T08:33:11+00:00 | {"pretty_name": "Evaluation run of kyujinpy/PlatYi-34B-200k-Q-FastChat", "dataset_summary": "Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-200k-Q-FastChat](https://huggingface.co/kyujinpy/PlatYi-34B-200k-Q-FastChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T08:30:20.014698](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat/blob/main/results_2023-12-10T08-30-20.014698.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7630727247628006,\n \"acc_stderr\": 0.028221206890446823,\n \"acc_norm\": 0.770488792020382,\n \"acc_norm_stderr\": 0.028732290582792492,\n \"mc1\": 0.34761321909424725,\n \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4838395775572536,\n \"mc2_stderr\": 0.014874467350764172\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.613481228668942,\n \"acc_stderr\": 0.014230084761910471,\n \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726097\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6467835092611034,\n \"acc_stderr\": 0.004769924131304649,\n \"acc_norm\": 0.8445528779127663,\n \"acc_norm_stderr\": 0.003615898928269288\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.03885004245800253,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.03885004245800253\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.02407999513006225,\n \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.02407999513006225\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.026947483121496228,\n \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.026947483121496228\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.03600105692727771,\n \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.03600105692727771\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7380952380952381,\n \"acc_stderr\": 0.022644212615525218,\n \"acc_norm\": 0.7380952380952381,\n \"acc_norm_stderr\": 0.022644212615525218\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.017308381281034527,\n \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.017308381281034527\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865397,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865397\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.0163199507007674,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.0163199507007674\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295127,\n \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295127\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.0196716324131003,\n \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.0196716324131003\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.02244826447683258,\n \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.02244826447683258\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9302752293577982,\n \"acc_stderr\": 0.010919426411848607,\n \"acc_norm\": 0.9302752293577982,\n \"acc_norm_stderr\": 0.010919426411848607\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.03191923445686186,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.03191923445686186\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065498,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065498\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035216,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035216\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.032472243899179465,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.032472243899179465\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.045479609997643757,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.045479609997643757\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331356,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331356\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253867,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253867\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9144316730523627,\n \"acc_stderr\": 0.010002965568647286,\n \"acc_norm\": 0.9144316730523627,\n \"acc_norm_stderr\": 0.010002965568647286\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.020903975842083027,\n \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.020903975842083027\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7262569832402235,\n \"acc_stderr\": 0.014912413096372432,\n \"acc_norm\": 0.7262569832402235,\n \"acc_norm_stderr\": 0.014912413096372432\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.01970403918385981,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.01970403918385981\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.842443729903537,\n \"acc_stderr\": 0.020692237273583984,\n \"acc_norm\": 0.842443729903537,\n \"acc_norm_stderr\": 0.020692237273583984\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.018498600558790906,\n \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.018498600558790906\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6205673758865248,\n \"acc_stderr\": 0.028947338851614095,\n \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.028947338851614095\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6173402868318123,\n \"acc_stderr\": 0.01241359588289327,\n \"acc_norm\": 0.6173402868318123,\n \"acc_norm_stderr\": 0.01241359588289327\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.024127463462650163,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.024127463462650163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101716,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101716\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4838395775572536,\n \"mc2_stderr\": 0.014874467350764172\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.01108253884749189\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.514783927217589,\n \"acc_stderr\": 0.0137664630507876\n }\n}\n```", "repo_url": 
"https://huggingface.co/kyujinpy/PlatYi-34B-200k-Q-FastChat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|arc:challenge|25_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|gsm8k|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hellaswag|10_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T08-30-20.014698.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T08-30-20.014698.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T08-30-20.014698.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T08-30-20.014698.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T08-30-20.014698.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T08-30-20.014698.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["**/details_harness|winogrande|5_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T08-30-20.014698.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T08_30_20.014698", "path": ["results_2023-12-10T08-30-20.014698.parquet"]}, {"split": "latest", "path": 
["results_2023-12-10T08-30-20.014698.parquet"]}]}]} | 2023-12-10T08:33:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200k-Q-FastChat
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-200k-Q-FastChat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
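A minimal sketch (the repository ID below is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention, since the original snippet is stripped from this card):

```python
from datasets import load_dataset

# Hypothetical repo ID, following the leaderboard's naming convention
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-200k-Q-FastChat",
	"harness_winogrande_5",
	split="train")
```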
## Latest results
These are the latest results from run 2023-12-10T08:30:20.014698 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200k-Q-FastChat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200k-Q-FastChat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of ... | [
6,
27,
31,
176,
68,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-200k-Q-FastChat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyu... |
f3690806e872b686cb26933b973d57d675496fe6 | # Glaive Code Assistant
[Glaive Code Assistant dataset](https://huggingface.co/datasets/glaiveai/glaive-code-assistant) formatted for training assistant models with the following prompt template:
```
<s>[INST] {question} [/INST] {answer} </s>
```
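For illustration, a minimal helper that renders one question/answer pair into this template could look like the sketch below (the function name and sample values are hypothetical):

```python
def format_sample(question: str, answer: str) -> str:
    # Render a single Q/A pair into the training prompt template shown above
    return f"<s>[INST] {question} [/INST] {answer} </s>"

# Example usage with a made-up pair
print(format_sample("Reverse a string in Python.", "Use slicing: s[::-1]"))
```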
The trained model can be prompted in Llama style:
```
<s>[INST] {{ user_msg }} [/INST]
``` | mwitiderrick/glaive-code-assistant | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"license:apache-2.0",
"region:us"
] | 2023-12-10T08:48:53+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "pretty_name": "Glaive Code Assistant", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 210090644, "num_examples": 136109}], "download_size": 100891258, "dataset_size": 210090644}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-10T09:02:55+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-apache-2.0 #region-us
| # Glaive Code Assistant
Glaive Code Assistant dataset formatted for training assistant models with the following prompt template:
Trained model can be prompted in Llama style:
| [
"# Glaive Code Assistant\nGlaive Code Assistant dataset formatted for training assistant models with the following prompt template: \n\n\nTrained model can be prompted in Llama style:"
] | [
"TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-apache-2.0 #region-us \n",
"# Glaive Code Assistant\nGlaive Code Assistant dataset formatted for training assistant models with the following prompt template: \n\n\nTrained model can be prompted in Llama style:"
] | [
41,
37
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-apache-2.0 #region-us \n# Glaive Code Assistant\nGlaive Code Assistant dataset formatted for training assistant models with the following prompt template: \n\n\nTrained model can be prompted in Llama style:"
] |
413a2b0341d2d1bd3707732c4ac23593008427a4 | # Dataset Card for "rapidapi-example-responses-tokenized-phi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | davidfant/rapidapi-example-responses-tokenized-phi | [
"region:us"
] | 2023-12-10T09:02:39+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "premise", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 168469345.01807085, "num_examples": 45271}, {"name": "test", "num_bytes": 18722123.98192915, "num_examples": 5031}], "download_size": 65907419, "dataset_size": 187191469.0}} | 2023-12-10T09:02:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "rapidapi-example-responses-tokenized-phi"
More Information needed | [
"# Dataset Card for \"rapidapi-example-responses-tokenized-phi\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"rapidapi-example-responses-tokenized-phi\"\n\nMore Information needed"
] | [
6,
26
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"rapidapi-example-responses-tokenized-phi\"\n\nMore Information needed"
] |
0a610511454a0562db5107a3f4e0606f962a07a1 |
# Dataset Card for Evaluation run of ehartford/dolphin-2.2-yi-34b-200k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/dolphin-2.2-yi-34b-200k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/dolphin-2.2-yi-34b-200k](https://huggingface.co/ehartford/dolphin-2.2-yi-34b-200k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k",
"harness_winogrande_5",
split="train")
```
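As a follow-up sketch, the aggregated metrics described above can be loaded through the "results" configuration (the "latest" split name is assumed from this repo's config metadata):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model
results = load_dataset("open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k",
	"results",
	split="latest")
```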
## Latest results
These are the [latest results from run 2023-12-10T09:19:14.695653](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k/blob/main/results_2023-12-10T09-19-14.695653.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5443333155719463,
"acc_stderr": 0.03403073973019475,
"acc_norm": 0.5545570631884628,
"acc_norm_stderr": 0.034865135931915724,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839672,
"mc2": 0.45931787186509654,
"mc2_stderr": 0.0156737639267665
},
"harness|arc:challenge|25": {
"acc": 0.3924914675767918,
"acc_stderr": 0.014269634635670714,
"acc_norm": 0.42150170648464164,
"acc_norm_stderr": 0.014430197069326023
},
"harness|hellaswag|10": {
"acc": 0.5135431189006174,
"acc_stderr": 0.004987950663406538,
"acc_norm": 0.6818362875921131,
"acc_norm_stderr": 0.00464811532232878
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273956,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273956
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178263,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178263
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.02529460802398648,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.02529460802398648
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.03191863374478464,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.03191863374478464
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.019109299846098295,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.019109299846098295
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.047928981709070624,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.047928981709070624
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.04865777570410769,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.04865777570410769
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.02777883590493543,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.02777883590493543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398687,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398687
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.523121387283237,
"acc_stderr": 0.026890297881303125,
"acc_norm": 0.523121387283237,
"acc_norm_stderr": 0.026890297881303125
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35307262569832404,
"acc_stderr": 0.01598420454526857,
"acc_norm": 0.35307262569832404,
"acc_norm_stderr": 0.01598420454526857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159628,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159628
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.02777091853142784,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.02777091853142784
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.027237415094592477,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.027237415094592477
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.028999080904806178,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.028999080904806178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4302477183833116,
"acc_stderr": 0.01264536143511522,
"acc_norm": 0.4302477183833116,
"acc_norm_stderr": 0.01264536143511522
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468317,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.02020665318788478,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.02020665318788478
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824565,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824565
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839672,
"mc2": 0.45931787186509654,
"mc2_stderr": 0.0156737639267665
},
"harness|winogrande|5": {
"acc": 0.6456195737963694,
"acc_stderr": 0.013443314368356088
},
"harness|gsm8k|5": {
"acc": 0.037149355572403335,
"acc_stderr": 0.00520951628307378
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k | [
"region:us"
] | 2023-12-10T09:22:04+00:00 | {"pretty_name": "Evaluation run of ehartford/dolphin-2.2-yi-34b-200k", "dataset_summary": "Dataset automatically created during the evaluation run of model [ehartford/dolphin-2.2-yi-34b-200k](https://huggingface.co/ehartford/dolphin-2.2-yi-34b-200k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T09:19:14.695653](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k/blob/main/results_2023-12-10T09-19-14.695653.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5443333155719463,\n \"acc_stderr\": 0.03403073973019475,\n \"acc_norm\": 0.5545570631884628,\n \"acc_norm_stderr\": 0.034865135931915724,\n \"mc1\": 0.2839657282741738,\n \"mc1_stderr\": 0.01578537085839672,\n \"mc2\": 0.45931787186509654,\n \"mc2_stderr\": 0.0156737639267665\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3924914675767918,\n \"acc_stderr\": 0.014269634635670714,\n \"acc_norm\": 0.42150170648464164,\n \"acc_norm_stderr\": 0.014430197069326023\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5135431189006174,\n \"acc_stderr\": 0.004987950663406538,\n \"acc_norm\": 0.6818362875921131,\n \"acc_norm_stderr\": 0.00464811532232878\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.03809342081273956,\n \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.03809342081273956\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.032685726586674915,\n \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.032685726586674915\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667768,\n \"acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667768\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178263,\n \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178263\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.02529460802398648,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.02529460802398648\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478464,\n \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478464\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.726605504587156,\n \"acc_stderr\": 0.019109299846098295,\n \"acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.019109299846098295\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.031321798030832904,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.031321798030832904\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.047928981709070624,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.047928981709070624\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.03731133519673893,\n \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.03731133519673893\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.04865777570410769,\n \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.04865777570410769\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n \"acc_stderr\": 0.02777883590493543,\n \"acc_norm\": 0.7649572649572649,\n \"acc_norm_stderr\": 0.02777883590493543\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n 
\"acc_stderr\": 0.015246803197398687,\n \"acc_norm\": 0.7611749680715197,\n \"acc_norm_stderr\": 0.015246803197398687\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.523121387283237,\n \"acc_stderr\": 0.026890297881303125,\n \"acc_norm\": 0.523121387283237,\n \"acc_norm_stderr\": 0.026890297881303125\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35307262569832404,\n \"acc_stderr\": 0.01598420454526857,\n \"acc_norm\": 0.35307262569832404,\n \"acc_norm_stderr\": 0.01598420454526857\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159628,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159628\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.027237415094592477,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.027237415094592477\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.028999080904806178,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.028999080904806178\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4302477183833116,\n \"acc_stderr\": 0.01264536143511522,\n \"acc_norm\": 0.4302477183833116,\n \"acc_norm_stderr\": 0.01264536143511522\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468317,\n \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468317\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.02020665318788478,\n \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.02020665318788478\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824565,\n \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824565\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n \"mc1_stderr\": 0.01578537085839672,\n \"mc2\": 0.45931787186509654,\n \"mc2_stderr\": 0.0156737639267665\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6456195737963694,\n \"acc_stderr\": 0.013443314368356088\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.037149355572403335,\n \"acc_stderr\": 0.00520951628307378\n }\n}\n```", 
"repo_url": "https://huggingface.co/ehartford/dolphin-2.2-yi-34b-200k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|arc:challenge|25_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|gsm8k|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hellaswag|10_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T09-19-14.695653.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T09-19-14.695653.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T09-19-14.695653.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T09-19-14.695653.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T09-19-14.695653.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T09_19_14.695653", "path": ["**/details_harness|winogrande|5_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T09-19-14.695653.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_10T09_19_14.695653", "path": ["results_2023-12-10T09-19-14.695653.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T09-19-14.695653.parquet"]}]}]} | 2023-12-10T09:22:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ehartford/dolphin-2.2-yi-34b-200k
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model ehartford/dolphin-2.2-yi-34b-200k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
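Below is a minimal sketch of that call, mirroring the loading pattern these leaderboard-details cards use elsewhere; the repo id `open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k` is inferred from the repo URL and config list in this card's metadata, so treat it as an assumption.

```python
from datasets import load_dataset

# Repo id inferred from this card's metadata (assumption, not stated in the stripped text).
# "train" points to the latest results, per the summary above.
data = load_dataset("open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k",
                    "harness_winogrande_5",
                    split="train")
```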
## Latest results
These are the latest results from run 2023-12-10T09:19:14.695653 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of ehartford/dolphin-2.2-yi-34b-200k",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ehartford/dolphin-2.2... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ehartford/dolphin-2.2-yi-34b-200k",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mo... | [
6,
24,
31,
173,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ehartford/dolphin-2.2-yi-34b-200k## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ehart... |
e6e5c42f5d2c478ac178ba726ae80dacc5fde2c1 | THIS DATASET IS NOT MINE, IT IS OWNED BY KALOMAZE,
WHICH I EDITED IN SUCH A WAY, DO NOT CLAIM THIS IS YOURS! GIVE CREDIT TO: LAYNZ28 AND KALOMAZE
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
THIS DATASET IS NOT MINE, IT IS OWNED BY KALOMAZE, WHICH I EDITED IN SUCH A WAY, DO NOT CLAIM THIS IS YOURS!
GIVE CREDIT TO: LAYNZ28 AND KALOMAZE
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **License:** [mit] | Lask8/Advanced-Mangio-RVC-Fork | [
"license:mit",
"music",
"region:us"
] | 2023-12-10T10:14:44+00:00 | {"license": "mit", "pretty_name": "mangi", "tags": ["music"]} | 2023-12-10T10:21:05+00:00 | [] | [] | TAGS
#license-mit #music #region-us
| THIS DATASET IS NOT MINE, IT IS OWNED BY KALOMAZE,
WHICH I EDITED IN SUCH A WAY, DO NOT CLAIM THIS IS YOURS! GIVE CREDIT TO: LAYNZ28 AND KALOMAZE
---
# Dataset Card for Dataset Name
THIS DATASET IS NOT MINE, IT IS OWNED BY KALOMAZE, WHICH I EDITED IN SUCH A WAY, DO NOT CLAIM THIS IS YOURS!
GIVE CREDIT TO: LAYNZ28 AND KALOMAZE
## Dataset Details
### Dataset Description
- License: [mit] | [
"# Dataset Card for Dataset Name\n\n\n\nTHIS DATASET IS NOT MINE, IT IS OWNED BY KALOMAZE, WHICH I EDITED IN SUCH A WAY, DO NOT CLAIM THIS IS YOURS! \nGIVE CREDIT TO: LAYNZ28 AND KALOMAZE",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- License: [mit]"
] | [
"TAGS\n#license-mit #music #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nTHIS DATASET IS NOT MINE, IT IS OWNED BY KALOMAZE, WHICH I EDITED IN SUCH A WAY, DO NOT CLAIM THIS IS YOURS! \nGIVE CREDIT TO: LAYNZ28 AND KALOMAZE",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- License: [mit]"
] | [
13,
66,
4,
11
] | [
"passage: TAGS\n#license-mit #music #region-us \n# Dataset Card for Dataset Name\n\n\n\nTHIS DATASET IS NOT MINE, IT IS OWNED BY KALOMAZE, WHICH I EDITED IN SUCH A WAY, DO NOT CLAIM THIS IS YOURS! \nGIVE CREDIT TO: LAYNZ28 AND KALOMAZE## Dataset Details### Dataset Description\n\n\n\n\n\n- License: [mit]"
] |
0b5545cf6de7b1c302f8314b82e41f45b58bfb66 |
# Dataset Card for Evaluation run of DopeorNope/COKAL-v1-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DopeorNope/COKAL-v1-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DopeorNope/COKAL-v1-70B](https://huggingface.co/DopeorNope/COKAL-v1-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B",
"harness_winogrande_5",
split="train")
```
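As a follow-up sketch, the aggregated metrics described above can be pulled from the "results" configuration; this assumes the standard leaderboard-details layout, where each configuration also exposes a "latest" split alongside the timestamped one.

```python
from datasets import load_dataset

# Aggregated run-level metrics; the "latest" split tracks the most recent eval
# (the timestamped split "2023_12_10T10_21_56.669760" holds this specific run).
results = load_dataset("open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B",
                       "results",
                       split="latest")
```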
## Latest results
These are the [latest results from run 2023-12-10T10:21:56.669760](https://huggingface.co/datasets/open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B/blob/main/results_2023-12-10T10-21-56.669760.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6806080675011864,
"acc_stderr": 0.031026141939535783,
"acc_norm": 0.6871684287339627,
"acc_norm_stderr": 0.03163298834751675,
"mc1": 0.609547123623011,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.7279131434968619,
"mc2_stderr": 0.012814436118254086
},
"harness|arc:challenge|25": {
"acc": 0.8583617747440273,
"acc_stderr": 0.010189361609566652,
"acc_norm": 0.8745733788395904,
"acc_norm_stderr": 0.009678644555462999
},
"harness|hellaswag|10": {
"acc": 0.6278629755028878,
"acc_stderr": 0.004823867761332464,
"acc_norm": 0.8329018123879706,
"acc_norm_stderr": 0.0037230107458783956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.034597776068105365,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.034597776068105365
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.025576257061253837,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.025576257061253837
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02548549837334323,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02548549837334323
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.023060438380857733,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.023060438380857733
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.01332134844761175,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.01332134844761175
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6342592592592593,
"acc_stderr": 0.032847388576472056,
"acc_norm": 0.6342592592592593,
"acc_norm_stderr": 0.032847388576472056
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316935,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316935
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758538,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038325,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038325
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594626,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594626
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445795,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445795
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741617,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741617
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6078212290502794,
"acc_stderr": 0.016329061073207453,
"acc_norm": 0.6078212290502794,
"acc_norm_stderr": 0.016329061073207453
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.02418515064781871,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.02418515064781871
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7901234567901234,
"acc_stderr": 0.02265834408598137,
"acc_norm": 0.7901234567901234,
"acc_norm_stderr": 0.02265834408598137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6114732724902217,
"acc_stderr": 0.012448817838292376,
"acc_norm": 0.6114732724902217,
"acc_norm_stderr": 0.012448817838292376
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.026040662474201264,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.026040662474201264
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7467320261437909,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.7467320261437909,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807082,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807082
},
"harness|truthfulqa:mc|0": {
"mc1": 0.609547123623011,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.7279131434968619,
"mc2_stderr": 0.012814436118254086
},
"harness|winogrande|5": {
"acc": 0.8026835043409629,
"acc_stderr": 0.011185026389050374
},
"harness|gsm8k|5": {
"acc": 0.39272175890826383,
"acc_stderr": 0.013451745349586566
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B | [
"region:us"
] | 2023-12-10T10:24:20+00:00 | {"pretty_name": "Evaluation run of DopeorNope/COKAL-v1-70B", "dataset_summary": "Dataset automatically created during the evaluation run of model [DopeorNope/COKAL-v1-70B](https://huggingface.co/DopeorNope/COKAL-v1-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T10:21:56.669760](https://huggingface.co/datasets/open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B/blob/main/results_2023-12-10T10-21-56.669760.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6806080675011864,\n \"acc_stderr\": 0.031026141939535783,\n \"acc_norm\": 0.6871684287339627,\n \"acc_norm_stderr\": 0.03163298834751675,\n \"mc1\": 0.609547123623011,\n \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.7279131434968619,\n \"mc2_stderr\": 0.012814436118254086\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.8583617747440273,\n \"acc_stderr\": 0.010189361609566652,\n \"acc_norm\": 0.8745733788395904,\n \"acc_norm_stderr\": 0.009678644555462999\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6278629755028878,\n \"acc_stderr\": 0.004823867761332464,\n \"acc_norm\": 0.8329018123879706,\n \"acc_norm_stderr\": 0.0037230107458783956\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.034597776068105365,\n \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.034597776068105365\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253837,\n \"acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253837\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02548549837334323,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02548549837334323\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.023060438380857733,\n \"acc_norm\": 
0.7076923076923077,\n \"acc_norm_stderr\": 0.023060438380857733\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.027553614467863804,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.027553614467863804\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8917431192660551,\n \"acc_stderr\": 0.01332134844761175,\n \"acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.01332134844761175\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6342592592592593,\n \"acc_stderr\": 0.032847388576472056,\n \"acc_norm\": 0.6342592592592593,\n \"acc_norm_stderr\": 0.032847388576472056\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316935,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316935\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758538,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758538\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n \"acc_stderr\": 0.029605103217038325,\n \"acc_norm\": 0.7354260089686099,\n \"acc_norm_stderr\": 0.029605103217038325\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594626,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594626\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445795,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445795\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n \"acc_stderr\": 0.013223928616741617,\n \"acc_norm\": 0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741617\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6078212290502794,\n \"acc_stderr\": 0.016329061073207453,\n \"acc_norm\": 0.6078212290502794,\n \"acc_norm_stderr\": 0.016329061073207453\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n \"acc_stderr\": 0.02418515064781871,\n \"acc_norm\": 0.7620578778135049,\n \"acc_norm_stderr\": 0.02418515064781871\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7901234567901234,\n \"acc_stderr\": 0.02265834408598137,\n \"acc_norm\": 0.7901234567901234,\n \"acc_norm_stderr\": 0.02265834408598137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303055,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303055\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6114732724902217,\n \"acc_stderr\": 0.012448817838292376,\n \"acc_norm\": 0.6114732724902217,\n \"acc_norm_stderr\": 0.012448817838292376\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7573529411764706,\n \"acc_stderr\": 0.026040662474201264,\n \"acc_norm\": 0.7573529411764706,\n \"acc_norm_stderr\": 0.026040662474201264\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7467320261437909,\n \"acc_stderr\": 0.01759348689536683,\n \"acc_norm\": 0.7467320261437909,\n \"acc_norm_stderr\": 0.01759348689536683\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807082,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807082\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.609547123623011,\n \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.7279131434968619,\n \"mc2_stderr\": 0.012814436118254086\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050374\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39272175890826383,\n \"acc_stderr\": 0.013451745349586566\n }\n}\n```", "repo_url": "https://huggingface.co/DopeorNope/COKAL-v1-70B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|arc:challenge|25_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|gsm8k|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hellaswag|10_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-21-56.669760.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-21-56.669760.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-21-56.669760.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T10-21-56.669760.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-21-56.669760.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-21-56.669760.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["**/details_harness|winogrande|5_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T10-21-56.669760.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T10_21_56.669760", "path": ["results_2023-12-10T10-21-56.669760.parquet"]}, {"split": "latest", "path": 
["results_2023-12-10T10-21-56.669760.parquet"]}]}]} | 2023-12-10T10:25:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DopeorNope/COKAL-v1-70B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model DopeorNope/COKAL-v1-70B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
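A minimal sketch of the loading snippet this sentence normally introduces (the code block was stripped from this rendering); the repository id `open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B` is inferred from the naming convention these cards follow, so treat it as an assumption:

```python
from datasets import load_dataset

# Per-sample details for one eval config; the "train" split tracks the latest run.
# The repository id below is inferred from the card's naming pattern, not confirmed.
data = load_dataset("open-llm-leaderboard/details_DopeorNope__COKAL-v1-70B",
	"harness_winogrande_5",
	split="train")
```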
## Latest results
These are the latest results from run 2023-12-10T10:21:56.669760 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of DopeorNope/COKAL-v1-70B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model DopeorNope/COKAL-v1-70B on the ... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DopeorNope/COKAL-v1-70B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Dopeor... | [
6,
22,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DopeorNope/COKAL-v1-70B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model DopeorNope/COKA... |
7cd825cf1090ce0505126ef2de14b77ca408bce5 |
# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties](https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties",
"harness_winogrande_5",
split="train")
```
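To pull the aggregated scores instead of per-sample details, the same call pattern works with the `"results"` config; a short sketch assuming the `"latest"` split alias that these cards define for each config:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; "latest" points at the newest results file.
results = load_dataset("open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties",
	"results",
	split="latest")
```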
## Latest results
These are the [latest results from run 2023-12-10T10:23:58.856045](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties/blob/main/results_2023-12-10T10-23-58.856045.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7470453642761706,
"acc_stderr": 0.028619765288934736,
"acc_norm": 0.7535136424409922,
"acc_norm_stderr": 0.02914053190348252,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5283809284788162,
"mc2_stderr": 0.01556812706457422
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111726,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726096
},
"harness|hellaswag|10": {
"acc": 0.6541525592511452,
"acc_stderr": 0.0047467168057357635,
"acc_norm": 0.8499302927703645,
"acc_norm_stderr": 0.003564098420387773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.61,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7787234042553192,
"acc_stderr": 0.027136349602424056,
"acc_norm": 0.7787234042553192,
"acc_norm_stderr": 0.027136349602424056
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7448275862068966,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.7448275862068966,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6587301587301587,
"acc_stderr": 0.02441923496681907,
"acc_norm": 0.6587301587301587,
"acc_norm_stderr": 0.02441923496681907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8903225806451613,
"acc_stderr": 0.017776778700485177,
"acc_norm": 0.8903225806451613,
"acc_norm_stderr": 0.017776778700485177
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527048,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527048
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7846153846153846,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.7846153846153846,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.030182099804387262,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.030182099804387262
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.865546218487395,
"acc_stderr": 0.022159373072744442,
"acc_norm": 0.865546218487395,
"acc_norm_stderr": 0.022159373072744442
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.040622900186837764,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.040622900186837764
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769567,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769567
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.01809424711647332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.01809424711647332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744632,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744632
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.02783991527833965,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.02783991527833965
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194165,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194165
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.01064835630187633,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.01064835630187633
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7284916201117319,
"acc_stderr": 0.014874252168095268,
"acc_norm": 0.7284916201117319,
"acc_norm_stderr": 0.014874252168095268
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.021828596053108402,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.021828596053108402
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8070739549839229,
"acc_stderr": 0.022411516780911366,
"acc_norm": 0.8070739549839229,
"acc_norm_stderr": 0.022411516780911366
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.01993508609214988,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.01993508609214988
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.028663820147199485,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.028663820147199485
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5893089960886571,
"acc_stderr": 0.012564871542534356,
"acc_norm": 0.5893089960886571,
"acc_norm_stderr": 0.012564871542534356
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8272058823529411,
"acc_stderr": 0.022966067585581795,
"acc_norm": 0.8272058823529411,
"acc_norm_stderr": 0.022966067585581795
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.01569702924075778,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.01569702924075778
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659386,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659386
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5283809284788162,
"mc2_stderr": 0.01556812706457422
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386776
},
"harness|gsm8k|5": {
"acc": 0.5405610310841547,
"acc_stderr": 0.013727093010429785
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties | [
"region:us"
] | 2023-12-10T10:26:49+00:00 | {"pretty_name": "Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties](https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T10:23:58.856045](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties/blob/main/results_2023-12-10T10-23-58.856045.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7470453642761706,\n \"acc_stderr\": 0.028619765288934736,\n \"acc_norm\": 0.7535136424409922,\n \"acc_norm_stderr\": 0.02914053190348252,\n \"mc1\": 0.3806609547123623,\n \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5283809284788162,\n \"mc2_stderr\": 0.01556812706457422\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111726,\n \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726096\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6541525592511452,\n \"acc_stderr\": 0.0047467168057357635,\n \"acc_norm\": 0.8499302927703645,\n \"acc_norm_stderr\": 0.003564098420387773\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n 
\"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7787234042553192,\n \"acc_stderr\": 0.027136349602424056,\n \"acc_norm\": 0.7787234042553192,\n \"acc_norm_stderr\": 0.027136349602424056\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7448275862068966,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.7448275862068966,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6587301587301587,\n \"acc_stderr\": 0.02441923496681907,\n \"acc_norm\": 0.6587301587301587,\n \"acc_norm_stderr\": 0.02441923496681907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8903225806451613,\n \"acc_stderr\": 0.017776778700485177,\n \"acc_norm\": 0.8903225806451613,\n \"acc_norm_stderr\": 0.017776778700485177\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 
0.012525310625527048,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527048\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7846153846153846,\n \"acc_stderr\": 0.020843034557462878,\n \"acc_norm\": 0.7846153846153846,\n \"acc_norm_stderr\": 0.020843034557462878\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.030182099804387262,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.030182099804387262\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.865546218487395,\n \"acc_stderr\": 0.022159373072744442,\n \"acc_norm\": 0.865546218487395,\n \"acc_norm_stderr\": 0.022159373072744442\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4503311258278146,\n \"acc_stderr\": 0.040622900186837764,\n \"acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.040622900186837764\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769567,\n \"acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769567\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.032365852526021574,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.032365852526021574\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647332,\n \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647332\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744632,\n \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744632\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243631,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243631\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.02783991527833965,\n \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.02783991527833965\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n \"acc_stderr\": 0.01872430174194165,\n \"acc_norm\": 0.9102564102564102,\n \"acc_norm_stderr\": 0.01872430174194165\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n 
\"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n \"acc_stderr\": 0.01064835630187633,\n \"acc_norm\": 0.9016602809706258,\n \"acc_norm_stderr\": 0.01064835630187633\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7284916201117319,\n \"acc_stderr\": 0.014874252168095268,\n \"acc_norm\": 0.7284916201117319,\n \"acc_norm_stderr\": 0.014874252168095268\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.021828596053108402,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.021828596053108402\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8070739549839229,\n \"acc_stderr\": 0.022411516780911366,\n \"acc_norm\": 0.8070739549839229,\n \"acc_norm_stderr\": 0.022411516780911366\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.01993508609214988,\n \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.01993508609214988\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.028663820147199485,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.028663820147199485\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5893089960886571,\n \"acc_stderr\": 0.012564871542534356,\n \"acc_norm\": 0.5893089960886571,\n \"acc_norm_stderr\": 0.012564871542534356\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8272058823529411,\n \"acc_stderr\": 0.022966067585581795,\n \"acc_norm\": 0.8272058823529411,\n \"acc_norm_stderr\": 0.022966067585581795\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659386,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659386\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5283809284788162,\n \"mc2_stderr\": 0.01556812706457422\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386776\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.5405610310841547,\n \"acc_stderr\": 0.013727093010429785\n }\n}\n```", "repo_url": "https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|arc:challenge|25_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|gsm8k|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hellaswag|10_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-23-58.856045.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-23-58.856045.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-23-58.856045.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T10-23-58.856045.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-23-58.856045.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["**/details_harness|winogrande|5_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-10T10-23-58.856045.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T10_23_58.856045", "path": ["results_2023-12-10T10-23-58.856045.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T10-23-58.856045.parquet"]}]}]} | 2023-12-10T10:27:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
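```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties",
	"harness_winogrande_5",
	split="train")
```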
## Latest results
These are the latest results from run 2023-12-10T10:23:58.856045 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mode... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during t... | [
6,
39,
31,
188,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evalua... |
d61fd51276cce0c553121ebf998e68e2e66b5352 |
## Dataset Description
A dataset for Chinese sentiment analysis.
Merged from two datasets:
- Weibo-Sentiment
- Shopping-Review | t1annnnn/Chinese_sentimentAnalyze | [
"license:mit",
"region:us"
] | 2023-12-10T10:32:28+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "label", "dtype": "int64"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21107188, "num_examples": 148036}, {"name": "validation", "num_bytes": 2327791, "num_examples": 16449}, {"name": "test", "num_bytes": 2615618, "num_examples": 18277}], "download_size": 20038622, "dataset_size": 26050597}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2023-12-30T06:51:16+00:00 | [] | [] | TAGS
#license-mit #region-us
|
## Dataset Description
A dataset for Chinese sentiment analysis.
Merged from two datasets:
- Weibo-Sentiment
- Shopping-Review | [
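As a minimal usage sketch (assuming the standard `datasets` loading API; the split and feature names below are taken from this dataset's metadata), the merged corpus can be loaded as follows:

```python
from datasets import load_dataset

# Load the merged Chinese sentiment corpus; per the dataset metadata it
# ships with "train", "validation", and "test" splits.
dataset = load_dataset("t1annnnn/Chinese_sentimentAnalyze")

# Each example carries an integer "label" and a "text" string.
print(dataset["train"][0])
```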
"## Dataset Description\n\nA dataset for Chinese Sentiment-Analyze\n\nMerged two datasets\n- Weibo-Sentiment\n- Shopping-Review"
] | [
"TAGS\n#license-mit #region-us \n",
"## Dataset Description\n\nA dataset for Chinese Sentiment-Analyze\n\nMerged two datasets\n- Weibo-Sentiment\n- Shopping-Review"
] | [
11,
33
] | [
"passage: TAGS\n#license-mit #region-us \n## Dataset Description\n\nA dataset for Chinese Sentiment-Analyze\n\nMerged two datasets\n- Weibo-Sentiment\n- Shopping-Review"
] |
02028560bc3ee7eee4b48e2f96efb2d7772e0643 |
# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama-v2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/abdulrahman-nuzha/finetuned-llama-v2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/finetuned-llama-v2.0](https://huggingface.co/abdulrahman-nuzha/finetuned-llama-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0",
"harness_winogrande_5",
split="train")
```
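Similarly, the aggregated metrics can be pulled from the "results" configuration (a minimal sketch; the config and split names are assumed to follow the same pattern as the other evaluation datasets on this leaderboard, with a "latest" split pointing at the most recent run):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest"
# split is assumed to point at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0",
	"results",
	split="latest")
```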
## Latest results
These are the [latest results from run 2023-12-10T10:47:40.022995](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0/blob/main/results_2023-12-10T10-47-40.022995.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43945994951550066,
"acc_stderr": 0.034385529407471936,
"acc_norm": 0.4442918982351828,
"acc_norm_stderr": 0.035190222707291795,
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.3908033560283727,
"mc2_stderr": 0.013656125379191442
},
"harness|arc:challenge|25": {
"acc": 0.4803754266211604,
"acc_stderr": 0.014600132075947087,
"acc_norm": 0.5315699658703071,
"acc_norm_stderr": 0.014582236460866978
},
"harness|hellaswag|10": {
"acc": 0.5789683330013942,
"acc_stderr": 0.0049271558825981845,
"acc_norm": 0.7775343557060347,
"acc_norm_stderr": 0.0041505226302310265
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02201908001221789,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02201908001221789
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4290322580645161,
"acc_stderr": 0.02815603653823321,
"acc_norm": 0.4290322580645161,
"acc_norm_stderr": 0.02815603653823321
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6321243523316062,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.6321243523316062,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4,
"acc_stderr": 0.024838811988033158,
"acc_norm": 0.4,
"acc_norm_stderr": 0.024838811988033158
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5853211009174312,
"acc_stderr": 0.021122903208602585,
"acc_norm": 0.5853211009174312,
"acc_norm_stderr": 0.021122903208602585
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.02623287897149166,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.02623287897149166
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.03506612560524867,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.03506612560524867
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5443037974683544,
"acc_stderr": 0.03241920684693334,
"acc_norm": 0.5443037974683544,
"acc_norm_stderr": 0.03241920684693334
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5291479820627802,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.5291479820627802,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4601226993865031,
"acc_stderr": 0.03915857291436972,
"acc_norm": 0.4601226993865031,
"acc_norm_stderr": 0.03915857291436972
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.030463656747340275,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.030463656747340275
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6002554278416348,
"acc_stderr": 0.017516847907053282,
"acc_norm": 0.6002554278416348,
"acc_norm_stderr": 0.017516847907053282
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5,
"acc_stderr": 0.028629916715693413,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028629916715693413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.028217683556652308,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.028217683556652308
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5030864197530864,
"acc_stderr": 0.027820214158594363,
"acc_norm": 0.5030864197530864,
"acc_norm_stderr": 0.027820214158594363
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.02788913930053478,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.02788913930053478
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32920469361147325,
"acc_stderr": 0.012002091666902297,
"acc_norm": 0.32920469361147325,
"acc_norm_stderr": 0.012002091666902297
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329387,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329387
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4297385620915033,
"acc_stderr": 0.020027122784928554,
"acc_norm": 0.4297385620915033,
"acc_norm_stderr": 0.020027122784928554
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.363265306122449,
"acc_stderr": 0.03078905113903081,
"acc_norm": 0.363265306122449,
"acc_norm_stderr": 0.03078905113903081
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5920398009950248,
"acc_stderr": 0.03475116365194092,
"acc_norm": 0.5920398009950248,
"acc_norm_stderr": 0.03475116365194092
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6374269005847953,
"acc_stderr": 0.0368713061556206,
"acc_norm": 0.6374269005847953,
"acc_norm_stderr": 0.0368713061556206
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148128,
"mc2": 0.3908033560283727,
"mc2_stderr": 0.013656125379191442
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440474
},
"harness|gsm8k|5": {
"acc": 0.09931766489764973,
"acc_stderr": 0.008238371412683965
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0 | [
"region:us"
] | 2023-12-10T10:50:45+00:00 | {"pretty_name": "Evaluation run of abdulrahman-nuzha/finetuned-llama-v2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/finetuned-llama-v2.0](https://huggingface.co/abdulrahman-nuzha/finetuned-llama-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T10:47:40.022995](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0/blob/main/results_2023-12-10T10-47-40.022995.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43945994951550066,\n \"acc_stderr\": 0.034385529407471936,\n \"acc_norm\": 0.4442918982351828,\n \"acc_norm_stderr\": 0.035190222707291795,\n \"mc1\": 0.24969400244798043,\n \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.3908033560283727,\n \"mc2_stderr\": 0.013656125379191442\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4803754266211604,\n \"acc_stderr\": 0.014600132075947087,\n \"acc_norm\": 0.5315699658703071,\n \"acc_norm_stderr\": 0.014582236460866978\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5789683330013942,\n \"acc_stderr\": 0.0049271558825981845,\n \"acc_norm\": 0.7775343557060347,\n \"acc_norm_stderr\": 0.0041505226302310265\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270655,\n \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04155319955593146\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02201908001221789,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02201908001221789\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4290322580645161,\n \"acc_stderr\": 0.02815603653823321,\n \"acc_norm\": 0.4290322580645161,\n \"acc_norm_stderr\": 0.02815603653823321\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6321243523316062,\n \"acc_stderr\": 0.034801756684660366,\n \"acc_norm\": 0.6321243523316062,\n 
\"acc_norm_stderr\": 0.034801756684660366\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.024838811988033158,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.024838811988033158\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.031693802357129965,\n \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.031693802357129965\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5853211009174312,\n \"acc_stderr\": 0.021122903208602585,\n \"acc_norm\": 0.5853211009174312,\n \"acc_norm_stderr\": 0.021122903208602585\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.18055555555555555,\n \"acc_stderr\": 0.02623287897149166,\n \"acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.02623287897149166\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.03506612560524867,\n \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.03506612560524867\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5443037974683544,\n \"acc_stderr\": 0.03241920684693334,\n \"acc_norm\": 0.5443037974683544,\n \"acc_norm_stderr\": 0.03241920684693334\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5291479820627802,\n \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.5291479820627802,\n \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.04374928560599738,\n \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.04374928560599738\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4601226993865031,\n \"acc_stderr\": 0.03915857291436972,\n \"acc_norm\": 0.4601226993865031,\n \"acc_norm_stderr\": 0.03915857291436972\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n \"acc_stderr\": 0.030463656747340275,\n \"acc_norm\": 0.6837606837606838,\n \"acc_norm_stderr\": 0.030463656747340275\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6002554278416348,\n \"acc_stderr\": 0.017516847907053282,\n \"acc_norm\": 0.6002554278416348,\n \"acc_norm_stderr\": 0.017516847907053282\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.02690784985628254,\n \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.02690784985628254\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028629916715693413,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028629916715693413\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n \"acc_stderr\": 0.028217683556652308,\n \"acc_norm\": 0.5562700964630225,\n \"acc_norm_stderr\": 0.028217683556652308\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5030864197530864,\n \"acc_stderr\": 0.027820214158594363,\n \"acc_norm\": 0.5030864197530864,\n \"acc_norm_stderr\": 0.027820214158594363\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.32269503546099293,\n \"acc_stderr\": 0.02788913930053478,\n \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.02788913930053478\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32920469361147325,\n \"acc_stderr\": 0.012002091666902297,\n \"acc_norm\": 0.32920469361147325,\n \"acc_norm_stderr\": 0.012002091666902297\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329387,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329387\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4297385620915033,\n \"acc_stderr\": 0.020027122784928554,\n \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.020027122784928554\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.363265306122449,\n \"acc_stderr\": 0.03078905113903081,\n \"acc_norm\": 0.363265306122449,\n \"acc_norm_stderr\": 0.03078905113903081\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5920398009950248,\n \"acc_stderr\": 0.03475116365194092,\n \"acc_norm\": 0.5920398009950248,\n \"acc_norm_stderr\": 0.03475116365194092\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6374269005847953,\n \"acc_stderr\": 0.0368713061556206,\n \"acc_norm\": 0.6374269005847953,\n \"acc_norm_stderr\": 0.0368713061556206\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n \"mc1_stderr\": 0.015152286907148128,\n \"mc2\": 0.3908033560283727,\n \"mc2_stderr\": 0.013656125379191442\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440474\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09931766489764973,\n 
\"acc_stderr\": 0.008238371412683965\n }\n}\n```", "repo_url": "https://huggingface.co/abdulrahman-nuzha/finetuned-llama-v2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|arc:challenge|25_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|gsm8k|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hellaswag|10_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-47-40.022995.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-47-40.022995.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-47-40.022995.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T10-47-40.022995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-47-40.022995.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T10_47_40.022995", "path": ["**/details_harness|winogrande|5_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T10-47-40.022995.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_10T10_47_40.022995", "path": ["results_2023-12-10T10-47-40.022995.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T10-47-40.022995.parquet"]}]}]} | 2023-12-10T10:51:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama-v2.0
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-llama-v2.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
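A minimal sketch; the dataset id and the `harness_winogrande_5` configuration come from this card's own metadata:

```python
from datasets import load_dataset

# Load one detail configuration; "train" points at the latest run.
data = load_dataset("open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0",
	"harness_winogrande_5",
	split="train")
```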
## Latest results
These are the latest results from run 2023-12-10T10:47:40.022995 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
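A minimal sketch for fetching that raw results file directly with `huggingface_hub`; the filename follows this run's `results_<timestamp>.json` naming, and the exact layout of the loaded object should be inspected rather than assumed:

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated-results JSON for the 2023-12-10T10:47:40.022995 run.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama-v2.0",
    filename="results_2023-12-10T10-47-40.022995.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(list(results))  # top-level keys; the per-task metrics live inside this dict
```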
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama-v2.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzh... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama-v2.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run ... | [
6,
26,
31,
175,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama-v2.0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ... |
2f983e01d7d420ecd751b54a13486709e9849025 |
# Dataset Card for Evaluation run of meta-math/MetaMath-Llemma-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/meta-math/MetaMath-Llemma-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [meta-math/MetaMath-Llemma-7B](https://huggingface.co/meta-math/MetaMath-Llemma-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_meta-math__MetaMath-Llemma-7B",
"harness_winogrande_5",
split="train")
```
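The aggregated metrics can be pulled the same way from the "results" configuration; a sketch assuming the split-naming convention described above (one timestamped split per run plus a "latest" alias):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" is assumed to alias the newest run.
results = load_dataset("open-llm-leaderboard/details_meta-math__MetaMath-Llemma-7B",
	"results",
	split="latest")
```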
## Latest results
These are the [latest results from run 2023-12-10T10:48:07.737490](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-Llemma-7B/blob/main/results_2023-12-10T10-48-07.737490.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4805727831472479,
"acc_stderr": 0.03501873176922748,
"acc_norm": 0.47876803306000837,
"acc_norm_stderr": 0.03574673517078834,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557994,
"mc2": 0.39610018025256144,
"mc2_stderr": 0.015159247351087708
},
"harness|arc:challenge|25": {
"acc": 0.439419795221843,
"acc_stderr": 0.014503747823580125,
"acc_norm": 0.46501706484641636,
"acc_norm_stderr": 0.01457558392201967
},
"harness|hellaswag|10": {
"acc": 0.4731129257120096,
"acc_stderr": 0.004982561815214125,
"acc_norm": 0.6169089822744473,
"acc_norm_stderr": 0.004851466623601442
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47924528301886793,
"acc_stderr": 0.030746349975723463,
"acc_norm": 0.47924528301886793,
"acc_norm_stderr": 0.030746349975723463
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944423,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944423
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.028422687404312107,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.028422687404312107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.035212249088415845,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.035212249088415845
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5492227979274611,
"acc_stderr": 0.03590910952235524,
"acc_norm": 0.5492227979274611,
"acc_norm_stderr": 0.03590910952235524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6091743119266055,
"acc_stderr": 0.02092005834611106,
"acc_norm": 0.6091743119266055,
"acc_norm_stderr": 0.02092005834611106
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5049019607843137,
"acc_stderr": 0.03509143375606785,
"acc_norm": 0.5049019607843137,
"acc_norm_stderr": 0.03509143375606785
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5654008438818565,
"acc_stderr": 0.03226759995510145,
"acc_norm": 0.5654008438818565,
"acc_norm_stderr": 0.03226759995510145
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.39461883408071746,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.39461883408071746,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068384,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068384
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.039158572914369714,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.039158572914369714
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041697,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041697
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.030572811310299607,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.030572811310299607
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.561941251596424,
"acc_stderr": 0.01774223223825723,
"acc_norm": 0.561941251596424,
"acc_norm_stderr": 0.01774223223825723
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.026918645383239022,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.026918645383239022
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.015445716910998893,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.015445716910998893
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.028614624752805434,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.028614624752805434
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4983922829581994,
"acc_stderr": 0.02839794490780661,
"acc_norm": 0.4983922829581994,
"acc_norm_stderr": 0.02839794490780661
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.44753086419753085,
"acc_stderr": 0.02766713856942271,
"acc_norm": 0.44753086419753085,
"acc_norm_stderr": 0.02766713856942271
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3396349413298566,
"acc_stderr": 0.012095592506931967,
"acc_norm": 0.3396349413298566,
"acc_norm_stderr": 0.012095592506931967
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.02993534270787775,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.02993534270787775
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4068627450980392,
"acc_stderr": 0.019873802005061177,
"acc_norm": 0.4068627450980392,
"acc_norm_stderr": 0.019873802005061177
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.034005985055990146,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.034005985055990146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03811079669833531,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03811079669833531
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557994,
"mc2": 0.39610018025256144,
"mc2_stderr": 0.015159247351087708
},
"harness|winogrande|5": {
"acc": 0.6274664561957379,
"acc_stderr": 0.013588173888522445
},
"harness|gsm8k|5": {
"acc": 0.6095526914329037,
"acc_stderr": 0.013437829864668582
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_meta-math__MetaMath-Llemma-7B | [
"region:us"
] | 2023-12-10T10:51:06+00:00 | {"pretty_name": "Evaluation run of meta-math/MetaMath-Llemma-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [meta-math/MetaMath-Llemma-7B](https://huggingface.co/meta-math/MetaMath-Llemma-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_meta-math__MetaMath-Llemma-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T10:48:07.737490](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-Llemma-7B/blob/main/results_2023-12-10T10-48-07.737490.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4805727831472479,\n \"acc_stderr\": 0.03501873176922748,\n \"acc_norm\": 0.47876803306000837,\n \"acc_norm_stderr\": 0.03574673517078834,\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557994,\n \"mc2\": 0.39610018025256144,\n \"mc2_stderr\": 0.015159247351087708\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.439419795221843,\n \"acc_stderr\": 0.014503747823580125,\n \"acc_norm\": 0.46501706484641636,\n \"acc_norm_stderr\": 0.01457558392201967\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4731129257120096,\n \"acc_stderr\": 0.004982561815214125,\n \"acc_norm\": 0.6169089822744473,\n \"acc_norm_stderr\": 0.004851466623601442\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.47924528301886793,\n \"acc_stderr\": 0.030746349975723463,\n \"acc_norm\": 0.47924528301886793,\n \"acc_norm_stderr\": 0.030746349975723463\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.032662042990646796,\n \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.032662042990646796\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944423,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944423\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5193548387096775,\n \"acc_stderr\": 0.028422687404312107,\n \"acc_norm\": 0.5193548387096775,\n \"acc_norm_stderr\": 0.028422687404312107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.035212249088415845,\n \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.035212249088415845\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5492227979274611,\n \"acc_stderr\": 0.03590910952235524,\n \"acc_norm\": 0.5492227979274611,\n \"acc_norm_stderr\": 0.03590910952235524\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6091743119266055,\n \"acc_stderr\": 0.02092005834611106,\n \"acc_norm\": 0.6091743119266055,\n \"acc_norm_stderr\": 0.02092005834611106\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5049019607843137,\n \"acc_stderr\": 0.03509143375606785,\n \"acc_norm\": 0.5049019607843137,\n \"acc_norm_stderr\": 0.03509143375606785\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5654008438818565,\n \"acc_stderr\": 0.03226759995510145,\n \"acc_norm\": 0.5654008438818565,\n \"acc_norm_stderr\": 0.03226759995510145\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.39461883408071746,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.39461883408071746,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068384,\n \"acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068384\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.039158572914369714,\n \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.039158572914369714\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041697,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041697\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.030572811310299607,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.030572811310299607\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.561941251596424,\n 
\"acc_stderr\": 0.01774223223825723,\n \"acc_norm\": 0.561941251596424,\n \"acc_norm_stderr\": 0.01774223223825723\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.026918645383239022,\n \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.026918645383239022\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n \"acc_stderr\": 0.015445716910998893,\n \"acc_norm\": 0.30837988826815643,\n \"acc_norm_stderr\": 0.015445716910998893\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.028614624752805434,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.028614624752805434\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4983922829581994,\n \"acc_stderr\": 0.02839794490780661,\n \"acc_norm\": 0.4983922829581994,\n \"acc_norm_stderr\": 0.02839794490780661\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.44753086419753085,\n \"acc_stderr\": 0.02766713856942271,\n \"acc_norm\": 0.44753086419753085,\n \"acc_norm_stderr\": 0.02766713856942271\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759422,\n \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759422\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3396349413298566,\n \"acc_stderr\": 0.012095592506931967,\n \"acc_norm\": 0.3396349413298566,\n \"acc_norm_stderr\": 0.012095592506931967\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.02993534270787775,\n \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.02993534270787775\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4068627450980392,\n \"acc_stderr\": 0.019873802005061177,\n \"acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.019873802005061177\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.6368159203980099,\n \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03811079669833531,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03811079669833531\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557994,\n \"mc2\": 0.39610018025256144,\n \"mc2_stderr\": 0.015159247351087708\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6274664561957379,\n \"acc_stderr\": 0.013588173888522445\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6095526914329037,\n \"acc_stderr\": 0.013437829864668582\n }\n}\n```", "repo_url": 
"https://huggingface.co/meta-math/MetaMath-Llemma-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|arc:challenge|25_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|gsm8k|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hellaswag|10_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-48-07.737490.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-48-07.737490.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-48-07.737490.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T10-48-07.737490.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-48-07.737490.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-48-07.737490.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["**/details_harness|winogrande|5_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T10-48-07.737490.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T10_48_07.737490", "path": ["results_2023-12-10T10-48-07.737490.parquet"]}, {"split": "latest", "path": 
["results_2023-12-10T10-48-07.737490.parquet"]}]}]} | 2023-12-10T10:51:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of meta-math/MetaMath-Llemma-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model meta-math/MetaMath-Llemma-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
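The snippet below loads the 5-shot Winogrande details; any of the other task configurations listed in this card's metadata can be substituted for `harness_winogrande_5`.
```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task; the "train" split
# always points to the latest evaluation run.
data = load_dataset("open-llm-leaderboard/details_meta-math__MetaMath-Llemma-7B",
	"harness_winogrande_5",
	split="train")
```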
## Latest results
These are the latest results from run 2023-12-10T10:48:07.737490 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval).
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of meta-math/MetaMath-Llemma-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model meta-math/MetaMath-Llemma-... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of meta-math/MetaMath-Llemma-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model m... | [
6,
22,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of meta-math/MetaMath-Llemma-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model meta-math/... |
3d480fc1c44e77dee170bacd70d8764879dbb75a |
# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity](https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity",
"harness_winogrande_5",
split="train")
```
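
The aggregated metrics live in the "results" configuration; a minimal sketch loading its "latest" split (config and split names taken from this card's metadata):

```python
from datasets import load_dataset

# "results" stores the aggregated run metrics; the "latest" split always
# points to the newest evaluation (timestamped splits such as
# "2023_12_10T11_09_27.293582" are also available).
results = load_dataset(
    "open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity",
    "results",
    split="latest")
```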
## Latest results
These are the [latest results from run 2023-12-10T11:09:27.293582](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity/blob/main/results_2023-12-10T11-09-27.293582.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7687816232153334,
"acc_stderr": 0.027928447952356685,
"acc_norm": 0.7740918392624115,
"acc_norm_stderr": 0.028444170747139307,
"mc1": 0.42472460220318237,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.5784493836096679,
"mc2_stderr": 0.015412397526106352
},
"harness|arc:challenge|25": {
"acc": 0.6484641638225256,
"acc_stderr": 0.013952413699600933,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693247
},
"harness|hellaswag|10": {
"acc": 0.663114917347142,
"acc_stderr": 0.00471679287443321,
"acc_norm": 0.857697669786895,
"acc_norm_stderr": 0.0034864596026062417
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.725925925925926,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.725925925925926,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.024974533450920697,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.024974533450920697
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.024079995130062246,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.024079995130062246
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.023112508176051236,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.023112508176051236
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8,
"acc_stderr": 0.0261488180184245,
"acc_norm": 0.8,
"acc_norm_stderr": 0.0261488180184245
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04434600701584925,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04434600701584925
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7724137931034483,
"acc_stderr": 0.03493950380131183,
"acc_norm": 0.7724137931034483,
"acc_norm_stderr": 0.03493950380131183
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7116402116402116,
"acc_stderr": 0.023330654054535892,
"acc_norm": 0.7116402116402116,
"acc_norm_stderr": 0.023330654054535892
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9064516129032258,
"acc_stderr": 0.01656575466827098,
"acc_norm": 0.9064516129032258,
"acc_norm_stderr": 0.01656575466827098
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.645320197044335,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.645320197044335,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.02602465765165619,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.02602465765165619
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.01699999492742161,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.01699999492742161
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.823076923076923,
"acc_stderr": 0.019348070174396985,
"acc_norm": 0.823076923076923,
"acc_norm_stderr": 0.019348070174396985
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.030401786406101507,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.030401786406101507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.022448264476832593,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.022448264476832593
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.011800361363016569,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.011800361363016569
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9411764705882353,
"acc_stderr": 0.016514409561025796,
"acc_norm": 0.9411764705882353,
"acc_norm_stderr": 0.016514409561025796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640266,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640266
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342323,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342323
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9074074074074074,
"acc_stderr": 0.02802188803860944,
"acc_norm": 0.9074074074074074,
"acc_norm_stderr": 0.02802188803860944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311357,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311357
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9080459770114943,
"acc_stderr": 0.010333225570778518,
"acc_norm": 0.9080459770114943,
"acc_norm_stderr": 0.010333225570778518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.021152676966575277,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.021152676966575277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7642458100558659,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.7642458100558659,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.02167005888551079,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.02167005888551079
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.01887735383957187,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.01887735383957187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6560283687943262,
"acc_stderr": 0.02833801742861133,
"acc_norm": 0.6560283687943262,
"acc_norm_stderr": 0.02833801742861133
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6121251629726207,
"acc_stderr": 0.012444998309675633,
"acc_norm": 0.6121251629726207,
"acc_norm_stderr": 0.012444998309675633
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8308823529411765,
"acc_stderr": 0.022770868010112997,
"acc_norm": 0.8308823529411765,
"acc_norm_stderr": 0.022770868010112997
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.014957635756617647,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.014957635756617647
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.02342097206916633,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.02342097206916633
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015578,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015578
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42472460220318237,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.5784493836096679,
"mc2_stderr": 0.015412397526106352
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838906
},
"harness|gsm8k|5": {
"acc": 0.6133434420015162,
"acc_stderr": 0.01341395509596531
}
}
```
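
Each block above maps to a per-task details configuration; for instance, all hendrycksTest (MMLU) sub-task details can be loaded at once through the grouped config (config and split names taken from this card's metadata):

```python
from datasets import load_dataset

# The grouped config bundles every hendrycksTest sub-task detail file.
mmlu_details = load_dataset(
    "open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity",
    "harness_hendrycksTest_5",
    split="latest")
```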
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity | [
"region:us"
] | 2023-12-10T11:12:18+00:00 | {"pretty_name": "Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity", "dataset_summary": "Dataset automatically created during the evaluation run of model [brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity](https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T11:09:27.293582](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity/blob/main/results_2023-12-10T11-09-27.293582.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7687816232153334,\n \"acc_stderr\": 0.027928447952356685,\n \"acc_norm\": 0.7740918392624115,\n \"acc_norm_stderr\": 0.028444170747139307,\n \"mc1\": 0.42472460220318237,\n \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.5784493836096679,\n \"mc2_stderr\": 0.015412397526106352\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6484641638225256,\n \"acc_stderr\": 0.013952413699600933,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693247\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.663114917347142,\n \"acc_stderr\": 0.00471679287443321,\n \"acc_norm\": 0.857697669786895,\n \"acc_norm_stderr\": 0.0034864596026062417\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.725925925925926,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.725925925925926,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.024974533450920697,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.024974533450920697\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.024079995130062246,\n \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.024079995130062246\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n 
\"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0261488180184245,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0261488180184245\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04434600701584925,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04434600701584925\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7724137931034483,\n \"acc_stderr\": 0.03493950380131183,\n \"acc_norm\": 0.7724137931034483,\n \"acc_norm_stderr\": 0.03493950380131183\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7116402116402116,\n \"acc_stderr\": 0.023330654054535892,\n \"acc_norm\": 0.7116402116402116,\n \"acc_norm_stderr\": 0.023330654054535892\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9064516129032258,\n \"acc_stderr\": 0.01656575466827098,\n \"acc_norm\": 0.9064516129032258,\n \"acc_norm_stderr\": 0.01656575466827098\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.645320197044335,\n \"acc_stderr\": 0.03366124489051449,\n \"acc_norm\": 0.645320197044335,\n \"acc_norm_stderr\": 0.03366124489051449\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.02602465765165619,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.02602465765165619\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.01699999492742161,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.01699999492742161\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n 
\"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.823076923076923,\n \"acc_stderr\": 0.019348070174396985,\n \"acc_norm\": 0.823076923076923,\n \"acc_norm_stderr\": 0.019348070174396985\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.030401786406101507,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.030401786406101507\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.022448264476832593,\n \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.022448264476832593\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.011800361363016569,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.011800361363016569\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9411764705882353,\n \"acc_stderr\": 0.016514409561025796,\n \"acc_norm\": 0.9411764705882353,\n \"acc_norm_stderr\": 0.016514409561025796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640266,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640266\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342323,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342323\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9074074074074074,\n \"acc_stderr\": 0.02802188803860944,\n \"acc_norm\": 0.9074074074074074,\n \"acc_norm_stderr\": 0.02802188803860944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.016534627684311357,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.016534627684311357\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n 
\"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9080459770114943,\n \"acc_stderr\": 0.010333225570778518,\n \"acc_norm\": 0.9080459770114943,\n \"acc_norm_stderr\": 0.010333225570778518\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575277,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575277\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7642458100558659,\n \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.7642458100558659,\n \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123138,\n \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123138\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n \"acc_stderr\": 0.02167005888551079,\n \"acc_norm\": 0.8231511254019293,\n \"acc_norm_stderr\": 0.02167005888551079\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.01887735383957187,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.01887735383957187\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6560283687943262,\n \"acc_stderr\": 0.02833801742861133,\n \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.02833801742861133\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6121251629726207,\n \"acc_stderr\": 0.012444998309675633,\n \"acc_norm\": 0.6121251629726207,\n \"acc_norm_stderr\": 0.012444998309675633\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8308823529411765,\n \"acc_stderr\": 0.022770868010112997,\n \"acc_norm\": 0.8308823529411765,\n \"acc_norm_stderr\": 0.022770868010112997\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.014957635756617647,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.014957635756617647\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.02342097206916633,\n \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.02342097206916633\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015578,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015578\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42472460220318237,\n \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.5784493836096679,\n \"mc2_stderr\": 0.015412397526106352\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8310970797158642,\n 
\"acc_stderr\": 0.010529981411838906\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6133434420015162,\n \"acc_stderr\": 0.01341395509596531\n }\n}\n```", "repo_url": "https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|arc:challenge|25_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|gsm8k|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hellaswag|10_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-09-27.293582.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-09-27.293582.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-09-27.293582.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T11-09-27.293582.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-09-27.293582.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["**/details_harness|winogrande|5_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-10T11-09-27.293582.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T11_09_27.293582", "path": ["results_2023-12-10T11-09-27.293582.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T11-09-27.293582.parquet"]}]}]} | 2023-12-10T11:13:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
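A minimal sketch, assuming the details repository follows the `details_<org>__<model>` naming convention used by the other leaderboard cards in this collection:
```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's details_<org>__<model> convention.
data = load_dataset("open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity",
	"harness_winogrande_5",
	split="train")
```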
## Latest results
These are the latest results from run 2023-12-10T11:09:27.293582 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval).
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically crea... | [
6,
44,
31,
193,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created durin... |
b67d2d6864258cf4ce8cb9108a6626751505cc33 | # Dataset Card for "rapidapi-example-responses-summaries"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | davidfant/rapidapi-example-responses-summaries | [
"region:us"
] | 2023-12-10T11:24:49+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "summary", "dtype": "string"}, {"name": "usage", "struct": [{"name": "completion_tokens", "dtype": "int64"}, {"name": "prompt_tokens", "dtype": "int64"}, {"name": "total_tokens", "dtype": "int64"}]}], "splits": [{"name": "train", "num_bytes": 667897, "num_examples": 1000}], "download_size": 276970, "dataset_size": 667897}} | 2023-12-11T10:06:49+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "rapidapi-example-responses-summaries"
More Information needed | [
"# Dataset Card for \"rapidapi-example-responses-summaries\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"rapidapi-example-responses-summaries\"\n\nMore Information needed"
] | [
6,
23
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"rapidapi-example-responses-summaries\"\n\nMore Information needed"
] |
e1b9d335c4d0b6eee4e9134b97141652af25a3e5 | dataset_info:
features:
- name: tat
dtype: string
- name: rus
dtype: string
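A minimal loading sketch, assuming the corpus exposes a default `train` split with the two string columns above:
```python
from datasets import load_dataset

# Split name "train" is an assumption; the card only documents the tat/rus columns.
corpus = load_dataset("IPSAN/tatar-russian-parallel-corpora", split="train")

# Each row pairs a Tatar sentence with its Russian translation.
for row in corpus.select(range(3)):
    print(row["tat"], "->", row["rus"])
```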
| IPSAN/tatar-russian-parallel-corpora | [
"region:us"
] | 2023-12-10T11:30:04+00:00 | {} | 2023-12-23T03:59:16+00:00 | [] | [] | TAGS
#region-us
| dataset_info:
features:
- name: tat
dtype: string
- name: rus
dtype: string
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
e04880c650c31e01af4befcb2da03edf968fc2ef | # Dataset Card for "ds_rplan_full_rplanpy_floorplan_to_color"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ekuhn/ds_rplan_full_rplanpy_floorplan_to_color | [
"region:us"
] | 2023-12-10T11:42:11+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "img", "struct": [{"name": "bytes", "dtype": "binary"}, {"name": "path", "dtype": "null"}]}, {"name": "num_rooms", "dtype": "int64"}], "splits": [{"name": "full", "num_bytes": 96780371, "num_examples": 80788}], "download_size": 52010769, "dataset_size": 96780371}} | 2023-12-10T11:42:45+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ds_rplan_full_rplanpy_floorplan_to_color"
More Information needed | [
"# Dataset Card for \"ds_rplan_full_rplanpy_floorplan_to_color\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ds_rplan_full_rplanpy_floorplan_to_color\"\n\nMore Information needed"
] | [
6,
28
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ds_rplan_full_rplanpy_floorplan_to_color\"\n\nMore Information needed"
] |
79f94fd7780b93f37aa1100a39bb54e4b99c8236 |
# Dataset Card for Evaluation run of Q-bert/Merged-AGI-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Q-bert/Merged-AGI-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Q-bert/Merged-AGI-7B](https://huggingface.co/Q-bert/Merged-AGI-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Q-bert__Merged-AGI-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-10T11:41:40.859542](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__Merged-AGI-7B/blob/main/results_2023-12-10T11-41-40.859542.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6531012862827063,
"acc_stderr": 0.03195381405127504,
"acc_norm": 0.6544274230849765,
"acc_norm_stderr": 0.03259816823473359,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6023562221172835,
"mc2_stderr": 0.015249864323171023
},
"harness|arc:challenge|25": {
"acc": 0.6578498293515358,
"acc_stderr": 0.01386415215917728,
"acc_norm": 0.6860068259385665,
"acc_norm_stderr": 0.013562691224726302
},
"harness|hellaswag|10": {
"acc": 0.6772555267874926,
"acc_stderr": 0.004665704208339041,
"acc_norm": 0.8615813582951604,
"acc_norm_stderr": 0.003446330748963712
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524558,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.014933868987028072,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.014933868987028072
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41675977653631285,
"acc_stderr": 0.016489134962438954,
"acc_norm": 0.41675977653631285,
"acc_norm_stderr": 0.016489134962438954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881876,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881876
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826368,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826368
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6023562221172835,
"mc2_stderr": 0.015249864323171023
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.011099796645920524
},
"harness|gsm8k|5": {
"acc": 0.6338134950720242,
"acc_stderr": 0.013270100238748828
}
}
```
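To retrieve these aggregated numbers programmatically, a minimal sketch using the "results" configuration listed in this card's metadata (its "latest" split always points at the most recent run):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; "latest" tracks the newest run.
results = load_dataset("open-llm-leaderboard/details_Q-bert__Merged-AGI-7B",
	"results",
	split="latest")
```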
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Q-bert__Merged-AGI-7B | [
"region:us"
] | 2023-12-10T11:44:32+00:00 | {"pretty_name": "Evaluation run of Q-bert/Merged-AGI-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Q-bert/Merged-AGI-7B](https://huggingface.co/Q-bert/Merged-AGI-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Q-bert__Merged-AGI-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T11:41:40.859542](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__Merged-AGI-7B/blob/main/results_2023-12-10T11-41-40.859542.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6531012862827063,\n \"acc_stderr\": 0.03195381405127504,\n \"acc_norm\": 0.6544274230849765,\n \"acc_norm_stderr\": 0.03259816823473359,\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6023562221172835,\n \"mc2_stderr\": 0.015249864323171023\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6578498293515358,\n \"acc_stderr\": 0.01386415215917728,\n \"acc_norm\": 0.6860068259385665,\n \"acc_norm_stderr\": 0.013562691224726302\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6772555267874926,\n \"acc_stderr\": 0.004665704208339041,\n \"acc_norm\": 0.8615813582951604,\n \"acc_norm_stderr\": 0.003446330748963712\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 
0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 
0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524558,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028072,\n \"acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028072\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 
0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881876,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881876\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826368,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826368\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6023562221172835,\n \"mc2_stderr\": 0.015249864323171023\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920524\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6338134950720242,\n \"acc_stderr\": 0.013270100238748828\n }\n}\n```", "repo_url": "https://huggingface.co/Q-bert/Merged-AGI-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|arc:challenge|25_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|gsm8k|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hellaswag|10_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-41-40.859542.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-41-40.859542.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-41-40.859542.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T11-41-40.859542.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-41-40.859542.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-41-40.859542.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["**/details_harness|winogrande|5_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T11-41-40.859542.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T11_41_40.859542", "path": ["results_2023-12-10T11-41-40.859542.parquet"]}, {"split": "latest", "path": 
["results_2023-12-10T11-41-40.859542.parquet"]}]}]} | 2023-12-10T11:45:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Q-bert/Merged-AGI-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Q-bert/Merged-AGI-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
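For instance, a minimal sketch (the repository name below is assumed from the leaderboard's standard `details_<org>__<model>` naming):

```python
from datasets import load_dataset

# Assumed repository name, following the leaderboard's details_<org>__<model> pattern.
data = load_dataset("open-llm-leaderboard/details_Q-bert__Merged-AGI-7B",
	"harness_winogrande_5",
	split="train")
```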
## Latest results
These are the latest results from run 2023-12-10T11:41:40.859542 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval).
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Q-bert/Merged-AGI-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Q-bert/Merged-AGI-7B on the Open L... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Q-bert/Merged-AGI-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Q-bert/Me... | [
6,
20,
31,
169,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Q-bert/Merged-AGI-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Q-bert/Merged-AGI-... |
3f7058c21d6a78f9a5c03c618dd5ece155bcc417 |
# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity](https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity",
"harness_winogrande_5",
split="train")
```
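The same call works for any of the other task configurations; for instance, a sketch loading one of the MMLU subtasks (the config name is taken from the list in this dataset's metadata):

```python
from datasets import load_dataset

# Any other config name listed for this dataset can be substituted here;
# the "latest" split always points at the most recent run.
mmlu_details = load_dataset(
    "open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```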
## Latest results
These are the [latest results from run 2023-12-10T11:50:37.068936](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity/blob/main/results_2023-12-10T11-50-37.068936.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7674525732857425,
"acc_stderr": 0.028092943162744702,
"acc_norm": 0.7731400785419068,
"acc_norm_stderr": 0.028608512168230946,
"mc1": 0.4259485924112607,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.5763314615956924,
"mc2_stderr": 0.01543636329925335
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620192,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.01375206241981783
},
"harness|hellaswag|10": {
"acc": 0.663612826130253,
"acc_stderr": 0.0047150751198345095,
"acc_norm": 0.8569010157339175,
"acc_norm_stderr": 0.003494581076398526
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.725925925925926,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.725925925925926,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.025648341251693612,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.025648341251693612
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8150943396226416,
"acc_stderr": 0.023893351834464317,
"acc_norm": 0.8150943396226416,
"acc_norm_stderr": 0.023893351834464317
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.026947483121496228,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.026947483121496228
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.04489539350270698,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.04489539350270698
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.036001056927277696,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.036001056927277696
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7195767195767195,
"acc_stderr": 0.023135287974325628,
"acc_norm": 0.7195767195767195,
"acc_norm_stderr": 0.023135287974325628
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9225806451612903,
"acc_stderr": 0.015203644420774848,
"acc_norm": 0.9225806451612903,
"acc_norm_stderr": 0.015203644420774848
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.03255086769970104,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.03255086769970104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723333,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8128205128205128,
"acc_stderr": 0.019776601086550032,
"acc_norm": 0.8128205128205128,
"acc_norm_stderr": 0.019776601086550032
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.44074074074074077,
"acc_stderr": 0.030270671157284074,
"acc_norm": 0.44074074074074077,
"acc_norm_stderr": 0.030270671157284074
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.022448264476832597,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.022448264476832597
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5165562913907285,
"acc_stderr": 0.04080244185628972,
"acc_norm": 0.5165562913907285,
"acc_norm_stderr": 0.04080244185628972
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9229357798165138,
"acc_stderr": 0.011434381698911096,
"acc_norm": 0.9229357798165138,
"acc_norm_stderr": 0.011434381698911096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640262,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640262
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342323,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342323
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407388,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407388
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869622,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869622
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6517857142857143,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.6517857142857143,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.909323116219668,
"acc_stderr": 0.010268429662528547,
"acc_norm": 0.909323116219668,
"acc_norm_stderr": 0.010268429662528547
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8236994219653179,
"acc_stderr": 0.020516425672490714,
"acc_norm": 0.8236994219653179,
"acc_norm_stderr": 0.020516425672490714
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7698324022346369,
"acc_stderr": 0.014078339253425814,
"acc_norm": 0.7698324022346369,
"acc_norm_stderr": 0.014078339253425814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.02117062301121351,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.02117062301121351
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.02167005888551079,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.02167005888551079
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.01887735383957187,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.01887735383957187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6560283687943262,
"acc_stderr": 0.02833801742861133,
"acc_norm": 0.6560283687943262,
"acc_norm_stderr": 0.02833801742861133
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6134289439374185,
"acc_stderr": 0.012437288868088727,
"acc_norm": 0.6134289439374185,
"acc_norm_stderr": 0.012437288868088727
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8492647058823529,
"acc_stderr": 0.021734235515652848,
"acc_norm": 0.8492647058823529,
"acc_norm_stderr": 0.021734235515652848
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8349673202614379,
"acc_stderr": 0.015017550799247322,
"acc_norm": 0.8349673202614379,
"acc_norm_stderr": 0.015017550799247322
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659386,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659386
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4259485924112607,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.5763314615956924,
"mc2_stderr": 0.01543636329925335
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.5981804397270659,
"acc_stderr": 0.013504357787494044
}
}
```
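To retrieve these aggregated numbers programmatically rather than reading the JSON above, the "results" configuration can be loaded the same way (a sketch, assuming the same `datasets` API as in the snippet above):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity",
    "results",
    split="latest",
)
```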
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity | [
"region:us"
] | 2023-12-10T11:53:26+00:00 | {"pretty_name": "Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity", "dataset_summary": "Dataset automatically created during the evaluation run of model [brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity](https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T11:50:37.068936](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity/blob/main/results_2023-12-10T11-50-37.068936.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7674525732857425,\n \"acc_stderr\": 0.028092943162744702,\n \"acc_norm\": 0.7731400785419068,\n \"acc_norm_stderr\": 0.028608512168230946,\n \"mc1\": 0.4259485924112607,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.5763314615956924,\n \"mc2_stderr\": 0.01543636329925335\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620192,\n \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.01375206241981783\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.663612826130253,\n \"acc_stderr\": 0.0047150751198345095,\n \"acc_norm\": 0.8569010157339175,\n \"acc_norm_stderr\": 0.003494581076398526\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.725925925925926,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.725925925925926,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.025648341251693612,\n \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.025648341251693612\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8150943396226416,\n \"acc_stderr\": 0.023893351834464317,\n \"acc_norm\": 0.8150943396226416,\n \"acc_norm_stderr\": 0.023893351834464317\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 
0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.026947483121496228,\n \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.026947483121496228\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.04489539350270698,\n \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.04489539350270698\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.036001056927277696,\n \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.036001056927277696\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7195767195767195,\n \"acc_stderr\": 0.023135287974325628,\n \"acc_norm\": 0.7195767195767195,\n \"acc_norm_stderr\": 0.023135287974325628\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9225806451612903,\n \"acc_stderr\": 0.015203644420774848,\n \"acc_norm\": 0.9225806451612903,\n \"acc_norm_stderr\": 0.015203644420774848\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.03255086769970104,\n \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.03255086769970104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723333,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723333\n },\n 
\"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8128205128205128,\n \"acc_stderr\": 0.019776601086550032,\n \"acc_norm\": 0.8128205128205128,\n \"acc_norm_stderr\": 0.019776601086550032\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.44074074074074077,\n \"acc_stderr\": 0.030270671157284074,\n \"acc_norm\": 0.44074074074074077,\n \"acc_norm_stderr\": 0.030270671157284074\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.022448264476832597,\n \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.022448264476832597\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9229357798165138,\n \"acc_stderr\": 0.011434381698911096,\n \"acc_norm\": 0.9229357798165138,\n \"acc_norm_stderr\": 0.011434381698911096\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640262,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640262\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342323,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342323\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.03343270062869622,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.03343270062869622\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6517857142857143,\n \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.6517857142857143,\n \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n 
\"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.909323116219668,\n \"acc_stderr\": 0.010268429662528547,\n \"acc_norm\": 0.909323116219668,\n \"acc_norm_stderr\": 0.010268429662528547\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8236994219653179,\n \"acc_stderr\": 0.020516425672490714,\n \"acc_norm\": 0.8236994219653179,\n \"acc_norm_stderr\": 0.020516425672490714\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7698324022346369,\n \"acc_stderr\": 0.014078339253425814,\n \"acc_norm\": 0.7698324022346369,\n \"acc_norm_stderr\": 0.014078339253425814\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.02117062301121351,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.02117062301121351\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n \"acc_stderr\": 0.02167005888551079,\n \"acc_norm\": 0.8231511254019293,\n \"acc_norm_stderr\": 0.02167005888551079\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.01887735383957187,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.01887735383957187\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6560283687943262,\n \"acc_stderr\": 0.02833801742861133,\n \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.02833801742861133\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6134289439374185,\n \"acc_stderr\": 0.012437288868088727,\n \"acc_norm\": 0.6134289439374185,\n \"acc_norm_stderr\": 0.012437288868088727\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8492647058823529,\n \"acc_stderr\": 0.021734235515652848,\n \"acc_norm\": 0.8492647058823529,\n \"acc_norm_stderr\": 0.021734235515652848\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8349673202614379,\n \"acc_stderr\": 0.015017550799247322,\n \"acc_norm\": 0.8349673202614379,\n \"acc_norm_stderr\": 0.015017550799247322\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659386,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659386\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4259485924112607,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.5763314615956924,\n \"mc2_stderr\": 
0.01543636329925335\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5981804397270659,\n \"acc_stderr\": 0.013504357787494044\n }\n}\n```", "repo_url": "https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|arc:challenge|25_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|gsm8k|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hellaswag|10_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-50-37.068936.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-50-37.068936.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-50-37.068936.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T11-50-37.068936.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-50-37.068936.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["**/details_harness|winogrande|5_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T11-50-37.068936.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T11_50_37.068936", "path": ["results_2023-12-10T11-50-37.068936.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T11-50-37.068936.parquet"]}]}]} | 2023-12-10T11:54:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
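The loading snippet itself was stripped from this flattened copy of the card; a minimal reconstruction follows, matching the pattern of the other cards in this dump (the repo id below is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's "details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity",
    "harness_winogrande_5",
    split="train",
)
```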
## Latest results
These are the latest results from run 2023-12-10T11:50:37.068936 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluat... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically c... | [
6,
46,
31,
195,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created du... |
a2f89d81cda22ce0fe70717387f9f38de027ec62 |
# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Falcon-1B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/Walter-Falcon-1B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Walter-Falcon-1B](https://huggingface.co/KnutJaegersberg/Walter-Falcon-1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B",
"harness_winogrande_5",
split="train")
```
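With 63 configurations (one per evaluated task, plus the aggregated "results" one), it can be easier to enumerate them programmatically than to copy names by hand. A minimal sketch, assuming the `datasets` library's `get_dataset_config_names` helper and the "latest" split alias used throughout this card:

```python
from datasets import get_dataset_config_names, load_dataset

# List every configuration exposed by this details dataset.
configs = get_dataset_config_names("open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B")
print(len(configs), "configurations")

# Load the "latest" split of a single task instead of the timestamped split.
anatomy = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B",
    "harness_hendrycksTest_anatomy_5",
    split="latest",
)
```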
## Latest results
These are the [latest results from run 2023-12-10T12:28:40.127971](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B/blob/main/results_2023-12-10T12-28-40.127971.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2504304926123702,
"acc_stderr": 0.03055596076992834,
"acc_norm": 0.2520875598433206,
"acc_norm_stderr": 0.031361161079445435,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080515,
"mc2": 0.38469934472881057,
"mc2_stderr": 0.014966198091063187
},
"harness|arc:challenge|25": {
"acc": 0.28498293515358364,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.310580204778157,
"acc_norm_stderr": 0.013522292098053054
},
"harness|hellaswag|10": {
"acc": 0.42381995618402707,
"acc_stderr": 0.0049315259610357536,
"acc_norm": 0.5491933877713603,
"acc_norm_stderr": 0.004965572246803867
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.030167533468632688,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.030167533468632688
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756191,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756191
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660185,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.023904914311782644,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.023904914311782644
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114485,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.026552207828215286,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.026552207828215286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.17098445595854922,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.17098445595854922,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128002,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128002
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.025497532639609542,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.025497532639609542
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2036697247706422,
"acc_stderr": 0.0172667420876308,
"acc_norm": 0.2036697247706422,
"acc_norm_stderr": 0.0172667420876308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.027920963147993666,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.027920963147993666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.03096451792692341,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.03096451792692341
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29535864978902954,
"acc_stderr": 0.029696338713422882,
"acc_norm": 0.29535864978902954,
"acc_norm_stderr": 0.029696338713422882
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.242152466367713,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.242152466367713,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.041032038305145124,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.041032038305145124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915206,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915206
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891155,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891155
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.016095302969878565,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.016095302969878565
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2282958199356913,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.2282958199356913,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.0230167056402622,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.0230167056402622
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279336,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279336
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.01728276069516742,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.01728276069516742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.02752963744017492,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.02752963744017492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.26865671641791045,
"acc_stderr": 0.031343283582089536,
"acc_norm": 0.26865671641791045,
"acc_norm_stderr": 0.031343283582089536
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553026,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553026
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080515,
"mc2": 0.38469934472881057,
"mc2_stderr": 0.014966198091063187
},
"harness|winogrande|5": {
"acc": 0.5540647198105761,
"acc_stderr": 0.01397009348233069
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
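The aggregated metrics shown above are also stored as a parquet file under the "results" configuration. A minimal sketch for loading it; the exact column layout of the results parquet is not documented here, so the code only inspects whatever it contains:

```python
from datasets import load_dataset

# "results" holds the aggregated run metrics; its column layout is an
# assumption, so inspect it rather than index into a specific field.
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B",
    "results",
    split="latest",
)
print(results.column_names)
print(results[0])
```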
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B | [
"region:us"
] | 2023-12-10T12:30:47+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Walter-Falcon-1B", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Walter-Falcon-1B](https://huggingface.co/KnutJaegersberg/Walter-Falcon-1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T12:28:40.127971](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B/blob/main/results_2023-12-10T12-28-40.127971.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2504304926123702,\n \"acc_stderr\": 0.03055596076992834,\n \"acc_norm\": 0.2520875598433206,\n \"acc_norm_stderr\": 0.031361161079445435,\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080515,\n \"mc2\": 0.38469934472881057,\n \"mc2_stderr\": 0.014966198091063187\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.28498293515358364,\n \"acc_stderr\": 0.013191348179838793,\n \"acc_norm\": 0.310580204778157,\n \"acc_norm_stderr\": 0.013522292098053054\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.42381995618402707,\n \"acc_stderr\": 0.0049315259610357536,\n \"acc_norm\": 0.5491933877713603,\n \"acc_norm_stderr\": 0.004965572246803867\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.030167533468632688,\n \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.030167533468632688\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756191,\n \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756191\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.037161774375660185,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.037161774375660185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880557,\n \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880557\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22903225806451613,\n \"acc_stderr\": 0.023904914311782644,\n \"acc_norm\": 0.22903225806451613,\n \"acc_norm_stderr\": 0.023904914311782644\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114485,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114485\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.026552207828215286,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.026552207828215286\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.17098445595854922,\n \"acc_stderr\": 0.02717121368316453,\n \"acc_norm\": 0.17098445595854922,\n \"acc_norm_stderr\": 0.02717121368316453\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128002,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128002\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22592592592592592,\n \"acc_stderr\": 0.025497532639609542,\n \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.025497532639609542\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863804,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863804\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.2036697247706422,\n \"acc_stderr\": 0.0172667420876308,\n \"acc_norm\": 0.2036697247706422,\n \"acc_norm_stderr\": 0.0172667420876308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.027920963147993666,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.027920963147993666\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692341,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692341\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.29535864978902954,\n \"acc_stderr\": 0.029696338713422882,\n \"acc_norm\": 0.29535864978902954,\n \"acc_norm_stderr\": 0.029696338713422882\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2809917355371901,\n \"acc_stderr\": 0.041032038305145124,\n \"acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.041032038305145124\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.044328040552915206,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.044328040552915206\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.029872577708891155,\n \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.029872577708891155\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.2822477650063857,\n \"acc_stderr\": 0.016095302969878565,\n \"acc_norm\": 0.2822477650063857,\n \"acc_norm_stderr\": 0.016095302969878565\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824775,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824775\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2282958199356913,\n \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.2282958199356913,\n \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.0230167056402622,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.0230167056402622\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n \"acc_stderr\": 0.011015752255279336,\n \"acc_norm\": 0.2470664928292047,\n \"acc_norm_stderr\": 0.011015752255279336\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.01728276069516742,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.01728276069516742\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.02752963744017492,\n \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.02752963744017492\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n \"acc_stderr\": 0.031343283582089536,\n \"acc_norm\": 0.26865671641791045,\n \"acc_norm_stderr\": 0.031343283582089536\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553026,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553026\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080515,\n \"mc2\": 0.38469934472881057,\n \"mc2_stderr\": 0.014966198091063187\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5540647198105761,\n \"acc_stderr\": 0.01397009348233069\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/KnutJaegersberg/Walter-Falcon-1B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|arc:challenge|25_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|gsm8k|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hellaswag|10_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T12-28-40.127971.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T12-28-40.127971.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T12-28-40.127971.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T12-28-40.127971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T12-28-40.127971.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T12-28-40.127971.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["**/details_harness|winogrande|5_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T12-28-40.127971.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_10T12_28_40.127971", "path": ["results_2023-12-10T12-28-40.127971.parquet"]}, {"split": "latest", "path": 
["results_2023-12-10T12-28-40.127971.parquet"]}]}]} | 2023-12-10T12:31:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Falcon-1B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model KnutJaegersberg/Walter-Falcon-1B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
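For example, here is a minimal sketch; it assumes the standard Open LLM Leaderboard naming for this model's details repo and uses the `harness_winogrande_5` configuration listed in this card's metadata:
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B",
	"harness_winogrande_5",
	split="train")
```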
## Latest results
These are the latest results from run 2023-12-10T12:28:40.127971 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Falcon-1B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Walter... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Falcon-1B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mod... | [
6,
23,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Falcon-1B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJa... |
8bc090fb12dd2221b7cf0d79155112b0cb8a0f1e | <p align="center">
<img src="https://s11.ax1x.com/2024/02/01/pFMDAm9.png" width="250" style="margin-bottom: 0.2;"/>
<p>
<h2 align="center"> <a href="https://arxiv.org/pdf/2310.01852.pdf">【ICLR 2024 🔥】LanguageBind: Extending Video-Language Pretraining to N-modality by Language-based Semantic Alignment</a></h2>
<h5 align="center"> If you like our project, please give us a star ⭐ on GitHub for the latest updates. </h5>
## 📰 News
* **[2024.01.27]** 👀👀👀 Our [MoE-LLaVA](https://github.com/PKU-YuanGroup/MoE-LLaVA) is released! A sparse model with 3B parameters outperformed the dense model with 7B parameters.
* **[2024.01.16]** 🔥🔥🔥 Our LanguageBind has been accepted at ICLR 2024! We earned scores of 6(3), 8(6), 6(6), 6(6) [here](https://openreview.net/forum?id=QmZKc7UZCy&noteId=OgsxQxAleA).
* **[2023.12.15]** 💪💪💪 We expand the 💥💥💥 VIDAL dataset and now have **10M video-text pairs**. We launch **LanguageBind_Video 1.5**; check our [model zoo](#-model-zoo).
* **[2023.12.10]** We expand the 💥💥💥 VIDAL dataset and now have **10M depth and 10M thermal data**. We are in the process of uploading thermal and depth data on [Hugging Face](https://huggingface.co/datasets/LanguageBind/VIDAL-Depth-Thermal) and expect the whole process to last 1-2 months.
* **[2023.11.27]** 🔥🔥🔥 We have updated our [paper](https://arxiv.org/abs/2310.01852) with emergency zero-shot results; check our ✨ [results](#emergency-results).
* **[2023.11.26]** 💥💥💥 We have open-sourced all textual sources and corresponding YouTube IDs [here](DATASETS.md).
* **[2023.11.26]** 📣📣📣 We have open-sourced the fully fine-tuned **Video & Audio** models, achieving improved performance once again; check our [model zoo](#-model-zoo).
* **[2023.11.22]** We are about to release a fully fine-tuned version, and the **HUGE** version is currently undergoing training.
* **[2023.11.21]** 💥 We are releasing sample data in [DATASETS.md](DATASETS.md) so that individuals who are interested can further modify the code to train it on their own data.
* **[2023.11.20]** 🚀🚀🚀 [Video-LLaVA](https://github.com/PKU-YuanGroup/Video-LLaVA) builds a large visual-language model to achieve 🎉SOTA performances based on LanguageBind encoders.
* **[2023.10.23]** 🎶 LanguageBind-Audio achieves 🎉🎉🎉**state-of-the-art (SOTA) performance on 5 datasets**; check our ✨ [results](#multiple-modalities)!
* **[2023.10.14]** 😱 Released a stronger LanguageBind-Video; check our ✨ [results](#video-language)! The video checkpoint **has been updated** on the Hugging Face Model Hub!
* **[2023.10.10]** We provide sample data, which can be found in [assets](assets), and [emergency zero-shot usage](#emergency-zero-shot) is described.
* **[2023.10.07]** The checkpoints are available on the 🤗 [Hugging Face Model Hub](https://huggingface.co/LanguageBind).
* **[2023.10.04]** Code and [demo](https://huggingface.co/spaces/LanguageBind/LanguageBind) are available now! Welcome to **watch** 👀 this repository for the latest updates.
## 😮 Highlights
### 💡 High performance, but NO intermediate modality required
LanguageBind is a **language-centric** multimodal pretraining approach, **taking the language as the bind across different modalities** because the language modality is well-explored and contains rich semantics.
* The following first figure shows the architecture of LanguageBind. LanguageBind can be easily extended to segmentation, detection tasks, and potentially to unlimited modalities.
### ⚡️ A multimodal, fully aligned and voluminous dataset
We propose **VIDAL-10M**, **10 Million data** with **V**ideo, **I**nfrared, **D**epth, **A**udio and their corresponding **L**anguage, which greatly expands the data beyond visual modalities.
* The second figure shows our proposed VIDAL-10M dataset, which includes five modalities: video, infrared, depth, audio, and language.
### 🔥 Multi-view enhanced description for training
We make multi-view enhancements to language. We produce multi-view descriptions that combine **meta-data**, **spatial**, and **temporal** information to greatly enhance the semantic information of the language. In addition, we further **enhance the language with ChatGPT** to create a good semantic space for each modality-aligned language.
## 🤗 Demo
* **Local demo.** We highly recommend trying out our web demo, which incorporates all features currently supported by LanguageBind.
```bash
python gradio_app.py
```
* **Online demo.** We provide the [online demo](https://huggingface.co/spaces/LanguageBind/LanguageBind) in Huggingface Spaces. In this demo, you can calculate the similarity of modalities to language, such as audio-to-language, video-to-language, and depth-to-image.
## 🛠️ Requirements and Installation
* Python >= 3.8
* Pytorch >= 1.13.1
* CUDA Version >= 11.6
* Install required packages:
```bash
git clone https://github.com/PKU-YuanGroup/LanguageBind
cd LanguageBind
pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116
pip install -r requirements.txt
```
## 🐳 Model Zoo
The names in the table represent different encoder models. For example, `LanguageBind/LanguageBind_Video_FT` represents the fully fine-tuned version, while `LanguageBind/LanguageBind_Video` represents the LoRA-tuned version.
You can freely replace them in the recommended [API usage](#-api). We recommend using the fully fine-tuned version, as it offers stronger performance.
<div align="center">
<table border="1" width="100%">
<tr align="center">
<th>Modality</th><th>LoRA tuning</th><th>Fine-tuning</th>
</tr>
<tr align="center">
<td>Video</td><td><a href="https://huggingface.co/LanguageBind/LanguageBind_Video">LanguageBind_Video</a></td><td><a href="https://huggingface.co/LanguageBind/LanguageBind_Video_FT">LanguageBind_Video_FT</a></td>
</tr>
<tr align="center">
<td>Audio</td><td><a href="https://huggingface.co/LanguageBind/LanguageBind_Audio">LanguageBind_Audio</a></td><td><a href="https://huggingface.co/LanguageBind/LanguageBind_Audio_FT">LanguageBind_Audio_FT</a></td>
</tr>
<tr align="center">
<td>Depth</td><td><a href="https://huggingface.co/LanguageBind/LanguageBind_Depth">LanguageBind_Depth</a></td><td>-</td>
</tr>
<tr align="center">
<td>Thermal</td><td><a href="https://huggingface.co/LanguageBind/LanguageBind_Thermal">LanguageBind_Thermal</a></td><td>-</td>
</tr>
</table>
</div>
<div align="center">
<table border="1" width="100%">
<tr align="center">
<th>Version</th><th>Tuning</th><th>Model size</th><th>Num_frames</th><th>HF Link</th><th>MSR-VTT</th><th>DiDeMo</th><th>ActivityNet</th><th>MSVD</th>
</tr>
<tr align="center">
<td>LanguageBind_Video</td><td>LoRA</td><td>Large</td><td>8</td><td><a href="https://huggingface.co/LanguageBind/LanguageBind_Video">Link</a></td><td>42.6</td><td>37.8</td><td>35.1</td><td>52.2</td>
</tr>
<tr align="center">
<td>LanguageBind_Video_FT</td><td>Full-tuning</td><td>Large</td><td>8</td><td><a href="https://huggingface.co/LanguageBind/LanguageBind_Video_FT">Link</a></td><td>42.7</td><td>38.1</td><td>36.9</td><td>53.5</td>
</tr>
<tr align="center">
<td>LanguageBind_Video_V1.5_FT</td><td>Full-tuning</td><td>Large</td><td>8</td><td><a href="https://huggingface.co/LanguageBind/LanguageBind_Video_V1.5_FT">Link</a></td><td>42.8</td><td>39.7</td><td>38.4</td><td>54.1</td>
</tr>
<tr align="center">
<td>LanguageBind_Video_V1.5_FT</td><td>Full-tuning</td><td>Large</td><td>12</td><td>Coming soon</td>
</tr>
<tr align="center">
<td>LanguageBind_Video_Huge_V1.5_FT</td><td>Full-tuning</td><td>Huge</td><td>8</td><td><a href="https://huggingface.co/LanguageBind/LanguageBind_Video_Huge_V1.5_FT">Link</a></td><td>44.8</td><td>39.9</td><td>41.0</td><td>53.7</td>
</tr>
<tr align="center">
<td>LanguageBind_Video_Huge_V1.5_FT</td><td>Full-tuning</td><td>Huge</td><td>12</td><td>Coming soon</td>
</tr>
</table>
</div>
## 🤖 API
**We open-source all modality preprocessing code.** If you want to load a model (e.g. ```LanguageBind/LanguageBind_Thermal```) from the Hugging Face model hub or from a local path, you can use the following code snippets!
### Inference for Multi-modal Binding
We have provided some sample data in [assets](assets) so you can quickly see how LanguageBind works.
```python
import torch
from languagebind import LanguageBind, to_device, transform_dict, LanguageBindImageTokenizer
if __name__ == '__main__':
device = 'cuda:0'
device = torch.device(device)
clip_type = {
'video': 'LanguageBind_Video_FT', # also LanguageBind_Video
'audio': 'LanguageBind_Audio_FT', # also LanguageBind_Audio
'thermal': 'LanguageBind_Thermal',
'image': 'LanguageBind_Image',
'depth': 'LanguageBind_Depth',
}
model = LanguageBind(clip_type=clip_type, cache_dir='./cache_dir')
model = model.to(device)
model.eval()
pretrained_ckpt = f'lb203/LanguageBind_Image'
tokenizer = LanguageBindImageTokenizer.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir/tokenizer_cache_dir')
modality_transform = {c: transform_dict[c](model.modality_config[c]) for c in clip_type.keys()}
image = ['assets/image/0.jpg', 'assets/image/1.jpg']
audio = ['assets/audio/0.wav', 'assets/audio/1.wav']
video = ['assets/video/0.mp4', 'assets/video/1.mp4']
depth = ['assets/depth/0.png', 'assets/depth/1.png']
thermal = ['assets/thermal/0.jpg', 'assets/thermal/1.jpg']
language = ["Training a parakeet to climb up a ladder.", 'A lion climbing a tree to catch a monkey.']
inputs = {
'image': to_device(modality_transform['image'](image), device),
'video': to_device(modality_transform['video'](video), device),
'audio': to_device(modality_transform['audio'](audio), device),
'depth': to_device(modality_transform['depth'](depth), device),
'thermal': to_device(modality_transform['thermal'](thermal), device),
}
inputs['language'] = to_device(tokenizer(language, max_length=77, padding='max_length',
truncation=True, return_tensors='pt'), device)
with torch.no_grad():
embeddings = model(inputs)
print("Video x Text: \n",
torch.softmax(embeddings['video'] @ embeddings['language'].T, dim=-1).detach().cpu().numpy())
print("Image x Text: \n",
torch.softmax(embeddings['image'] @ embeddings['language'].T, dim=-1).detach().cpu().numpy())
print("Depth x Text: \n",
torch.softmax(embeddings['depth'] @ embeddings['language'].T, dim=-1).detach().cpu().numpy())
print("Audio x Text: \n",
torch.softmax(embeddings['audio'] @ embeddings['language'].T, dim=-1).detach().cpu().numpy())
print("Thermal x Text: \n",
torch.softmax(embeddings['thermal'] @ embeddings['language'].T, dim=-1).detach().cpu().numpy())
```
This returns the following result:
```bash
Video x Text:
[[9.9989331e-01 1.0667283e-04]
[1.3255903e-03 9.9867439e-01]]
Image x Text:
[[9.9990666e-01 9.3292067e-05]
[4.6132666e-08 1.0000000e+00]]
Depth x Text:
[[0.9954276 0.00457235]
[0.12042473 0.8795753 ]]
Audio x Text:
[[0.97634876 0.02365119]
[0.02917843 0.97082156]]
Thermal x Text:
[[0.9482511 0.0517489 ]
[0.48746133 0.5125386 ]]
```
### Emergency zero-shot
Since LanguageBind binds each modality together, we also discovered **emergency zero-shot** capabilities. They are very simple to use.
```python
print("Video x Audio: \n", torch.softmax(embeddings['video'] @ embeddings['audio'].T, dim=-1).detach().cpu().numpy())
print("Image x Depth: \n", torch.softmax(embeddings['image'] @ embeddings['depth'].T, dim=-1).detach().cpu().numpy())
print("Image x Thermal: \n", torch.softmax(embeddings['image'] @ embeddings['thermal'].T, dim=-1).detach().cpu().numpy())
```
Then, you will get:
```
Video x Audio:
[[1.0000000e+00 0.0000000e+00]
[3.1150486e-32 1.0000000e+00]]
Image x Depth:
[[1. 0.]
[0. 1.]]
Image x Thermal:
[[1. 0.]
[0. 1.]]
```
### Different branches for X-Language task
Additionally, LanguageBind can be **disassembled into different branches** to handle different tasks. Note that we do not train the image encoder, which is simply initialized from OpenCLIP.
#### Thermal
```python
import torch
from languagebind import LanguageBindThermal, LanguageBindThermalTokenizer, LanguageBindThermalProcessor
pretrained_ckpt = 'LanguageBind/LanguageBind_Thermal'
model = LanguageBindThermal.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir')
tokenizer = LanguageBindThermalTokenizer.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir')
thermal_process = LanguageBindThermalProcessor(model.config, tokenizer)
model.eval()
data = thermal_process([r"your/thermal.jpg"], ['your text'], return_tensors='pt')
with torch.no_grad():
out = model(**data)
print(out.text_embeds @ out.image_embeds.T)
```
#### Depth
```python
import torch
from languagebind import LanguageBindDepth, LanguageBindDepthTokenizer, LanguageBindDepthProcessor
pretrained_ckpt = 'LanguageBind/LanguageBind_Depth'
model = LanguageBindDepth.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir')
tokenizer = LanguageBindDepthTokenizer.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir')
depth_process = LanguageBindDepthProcessor(model.config, tokenizer)
model.eval()
data = depth_process([r"your/depth.png"], ['your text.'], return_tensors='pt')
with torch.no_grad():
out = model(**data)
print(out.text_embeds @ out.image_embeds.T)
```
#### Video
```python
import torch
from languagebind import LanguageBindVideo, LanguageBindVideoTokenizer, LanguageBindVideoProcessor
pretrained_ckpt = 'LanguageBind/LanguageBind_Video_FT' # also 'LanguageBind/LanguageBind_Video'
model = LanguageBindVideo.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir')
tokenizer = LanguageBindVideoTokenizer.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir')
video_process = LanguageBindVideoProcessor(model.config, tokenizer)
model.eval()
data = video_process(["your/video.mp4"], ['your text.'], return_tensors='pt')
with torch.no_grad():
out = model(**data)
print(out.text_embeds @ out.image_embeds.T)
```
#### Audio
```python
import torch
from languagebind import LanguageBindAudio, LanguageBindAudioTokenizer, LanguageBindAudioProcessor
pretrained_ckpt = 'LanguageBind/LanguageBind_Audio_FT' # also 'LanguageBind/LanguageBind_Audio'
model = LanguageBindAudio.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir')
tokenizer = LanguageBindAudioTokenizer.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir')
audio_process = LanguageBindAudioProcessor(model.config, tokenizer)
model.eval()
data = audio_process([r"your/audio.wav"], ['your audio.'], return_tensors='pt')
with torch.no_grad():
out = model(**data)
print(out.text_embeds @ out.image_embeds.T)
```
#### Image
Note that our image encoder is the same as OpenCLIP's. It is **not** fine-tuned like the other modalities.
```python
import torch
from languagebind import LanguageBindImage, LanguageBindImageTokenizer, LanguageBindImageProcessor
pretrained_ckpt = 'LanguageBind/LanguageBind_Image'
model = LanguageBindImage.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir')
tokenizer = LanguageBindImageTokenizer.from_pretrained(pretrained_ckpt, cache_dir='./cache_dir')
image_process = LanguageBindImageProcessor(model.config, tokenizer)
model.eval()
data = image_process([r"your/image.jpg"], ['your text.'], return_tensors='pt')
with torch.no_grad():
out = model(**data)
print(out.text_embeds @ out.image_embeds.T)
```
## 💥 VIDAL-10M
The dataset is described in [DATASETS.md](DATASETS.md).
## 🗝️ Training & Validating
The training & validating instructions are in [TRAIN_AND_VALIDATE.md](TRAIN_AND_VALIDATE.md).
## 👍 Acknowledgement
* [OpenCLIP](https://github.com/mlfoundations/open_clip): an open-source pretraining framework.
* [CLIP4Clip](https://github.com/ArrowLuo/CLIP4Clip): an open-source video-text retrieval framework.
* [sRGB-TIR](https://github.com/rpmsnu/sRGB-TIR): an open-source framework to generate infrared (thermal) images.
* [GLPN](https://github.com/vinvino02/GLPDepth): an open-source framework to generate depth images.
## 🔒 License
* The majority of this project is released under the MIT license as found in the [LICENSE](https://github.com/PKU-YuanGroup/LanguageBind/blob/main/LICENSE) file.
* The dataset of this project is released under the CC-BY-NC 4.0 license as found in the [DATASET_LICENSE](https://github.com/PKU-YuanGroup/LanguageBind/blob/main/DATASET_LICENSE) file.
## ✏️ Citation
If you find our paper and code useful in your research, please consider giving a star :star: and citation :pencil:.
```BibTeX
@misc{zhu2023languagebind,
title={LanguageBind: Extending Video-Language Pretraining to N-modality by Language-based Semantic Alignment},
author={Bin Zhu and Bin Lin and Munan Ning and Yang Yan and Jiaxi Cui and Wang HongFa and Yatian Pang and Wenhao Jiang and Junwu Zhang and Zongwei Li and Cai Wan Zhang and Zhifeng Li and Wei Liu and Li Yuan},
year={2023},
eprint={2310.01852},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
## ✨ Star History
[](https://star-history.com/#PKU-YuanGroup/LanguageBind&Date)
## 🤝 Contributors
<a href="https://github.com/PKU-YuanGroup/LanguageBind/graphs/contributors">
<img src="https://contrib.rocks/image?repo=PKU-YuanGroup/LanguageBind" />
</a>
| LanguageBind/VIDAL-Depth-Thermal | [
"license:mit",
"arxiv:2310.01852",
"region:us"
] | 2023-12-10T12:37:49+00:00 | {"license": "mit"} | 2024-02-01T06:58:31+00:00 | [
"2310.01852"
] | [] | TAGS
#license-mit #arxiv-2310.01852 #region-us
| <p align="center">
<img src="URL width="250" style="margin-bottom: 0.2;"/>
<p>
<h2 align="center"> <a href="URL【ICLR 2024 】LanguageBind: Extending Video-Language Pretraining to N-modality by Language-based Semantic Alignment</a></h2>
<h5 align="center"> If you like our project, please give us a star ⭐ on GitHub for the latest updates. </h5>
## News
* [2024.01.27] Our MoE-LLaVA is released! A sparse model with 3B parameters outperformed the dense model with 7B parameters.
* [2024.01.16] Our LanguageBind has been accepted at ICLR 2024! We earned scores of 6(3), 8(6), 6(6), 6(6) here.
* [2023.12.15] We expand the VIDAL dataset and now have 10M video-text pairs. We launch LanguageBind_Video 1.5; check our model zoo.
* [2023.12.10] We expand the VIDAL dataset and now have 10M depth and 10M thermal data. We are in the process of uploading thermal and depth data on Hugging Face and expect the whole process to last 1-2 months.
* [2023.11.27] We have updated our paper with emergency zero-shot results; check our results.
* [2023.11.26] We have open-sourced all textual sources and corresponding YouTube IDs here.
* [2023.11.26] We have open-sourced the fully fine-tuned Video & Audio models, achieving improved performance once again; check our model zoo.
* [2023.11.22] We are about to release a fully fine-tuned version, and the HUGE version is currently undergoing training.
* [2023.11.21] We are releasing sample data in URL so that individuals who are interested can further modify the code to train it on their own data.
* [2023.11.20] Video-LLaVA builds a large visual-language model to achieve SOTA performances based on LanguageBind encoders.
* [2023.10.23] LanguageBind-Audio achieves state-of-the-art (SOTA) performance on 5 datasets; check our results!
* [2023.10.14] Released a stronger LanguageBind-Video; check our results! The video checkpoint has been updated on the Hugging Face Model Hub!
* [2023.10.10] We provide sample data, which can be found in assets, and emergency zero-shot usage is described.
* [2023.10.07] The checkpoints are available on Huggingface Model.
* [2023.10.04] Code and demo are available now! Welcome to watch this repository for the latest updates.
## Highlights
### High performance, but NO intermediate modality required
LanguageBind is a language-centric multimodal pretraining approach, taking the language as the bind across different modalities because the language modality is well-explored and contains rich semantics.
* The following first figure shows the architecture of LanguageBind. LanguageBind can be easily extended to segmentation, detection tasks, and potentially to unlimited modalities.
### ️ A multimodal, fully aligned and voluminous dataset
We propose VIDAL-10M, 10 Million data with Video, Infrared, Depth, Audio and their corresponding Language, which greatly expands the data beyond visual modalities.
* The second figure shows our proposed VIDAL-10M dataset, which includes five modalities: video, infrared, depth, audio, and language.
### Multi-view enhanced description for training
We make multi-view enhancements to language. We produce multi-view descriptions that combine meta-data, spatial, and temporal information to greatly enhance the semantic information of the language. In addition, we further enhance the language with ChatGPT to create a good semantic space for each modality-aligned language.
## Demo
* Local demo. We highly recommend trying out our web demo, which incorporates all features currently supported by LanguageBind.
* Online demo. We provide the online demo in Huggingface Spaces. In this demo, you can calculate the similarity of modalities to language, such as audio-to-language, video-to-language, and depth-to-image.
## ️ Requirements and Installation
* Python >= 3.8
* Pytorch >= 1.13.1
* CUDA Version >= 11.6
* Install required packages:
## Model Zoo
The names in the table represent different encoder models. For example, 'LanguageBind/LanguageBind_Video_FT' represents the fully fine-tuned version, while 'LanguageBind/LanguageBind_Video' represents the LoRA-tuned version.
You can freely replace them in the recommended API usage. We recommend using the fully fine-tuned version, as it offers stronger performance.
<div align="center">
<table border="1" width="100%">
<tr align="center">
<th>Modality</th><th>LoRA tuning</th><th>Fine-tuning</th>
</tr>
<tr align="center">
<td>Video</td><td><a href="URL href="URL
</tr>
<tr align="center">
<td>Audio</td><td><a href="URL href="URL
</tr>
<tr align="center">
<td>Depth</td><td><a href="URL
</tr>
<tr align="center">
<td>Thermal</td><td><a href="URL
</tr>
</table>
</div>
<div align="center">
<table border="1" width="100%">
<tr align="center">
<th>Version</th><th>Tuning</th><th>Model size</th><th>Num_frames</th><th>HF Link</th><th>MSR-VTT</th><th>DiDeMo</th><th>ActivityNet</th><th>MSVD</th>
</tr>
<tr align="center">
<td>LanguageBind_Video</td><td>LoRA</td><td>Large</td><td>8</td><td><a href="URL
</tr>
<tr align="center">
<td>LanguageBind_Video_FT</td><td>Full-tuning</td><td>Large</td><td>8</td><td><a href="URL
</tr>
<tr align="center">
<td>LanguageBind_Video_V1.5_FT</td><td>Full-tuning</td><td>Large</td><td>8</td><td><a href="URL
</tr>
<tr align="center">
<td>LanguageBind_Video_V1.5_FT</td><td>Full-tuning</td><td>Large</td><td>12</td><td>Coming soon</td>
</tr>
<tr align="center">
<td>LanguageBind_Video_Huge_V1.5_FT</td><td>Full-tuning</td><td>Huge</td><td>8</td><td><a href="URL
</tr>
<tr align="center">
<td>LanguageBind_Video_Huge_V1.5_FT</td><td>Full-tuning</td><td>Huge</td><td>12</td><td>Coming soon</td>
</tr>
</table>
</div>
## API
We open-source all modality preprocessing code. If you want to load a model (e.g. ) from the Hugging Face model hub or from a local path, you can use the following code snippets!
### Inference for Multi-modal Binding
We have provided some sample data in assets so you can quickly see how LanguageBind works.
This returns the following result.
### Emergency zero-shot
Since LanguageBind binds each modality together, we also discovered emergency zero-shot capabilities. They are very simple to use.
Then, you will get:
### Different branches for X-Language task
Additionally, LanguageBind can be disassembled into different branches to handle different tasks. Note that we do not train the image encoder, which is simply initialized from OpenCLIP.
#### Thermal
#### Depth
#### Video
#### Audio
#### Image
Note that our image encoder is the same as OpenCLIP's. It is not fine-tuned like the other modalities.
## VIDAL-10M
The dataset is described in URL.
## ️ Training & Validating
The training & validating instructions are in TRAIN_AND_VALIDATE.md.
## Acknowledgement
* OpenCLIP: an open-source pretraining framework.
* CLIP4Clip: an open-source video-text retrieval framework.
* sRGB-TIR: an open-source framework to generate infrared (thermal) images.
* GLPN: an open-source framework to generate depth images.
## License
* The majority of this project is released under the MIT license as found in the LICENSE file.
* The dataset of this project is released under the CC-BY-NC 4.0 license as found in the DATASET_LICENSE file.
## ️ Citation
If you find our paper and code useful in your research, please consider giving a star :star: and citation :pencil:.
## Star History
8(6)6(6)6(6) here.\n* [2023.12.15] We expand the VIDAL dataset and now have 10M vid... | [
"TAGS\n#license-mit #arxiv-2310.01852 #region-us \n",
"## News\n* [2024.01.27] Our MoE-LLaVA is released! A sparse model with 3B parameters outperformed the dense model with 7B parameters.\n* [2024.01.16] Our LanguageBind has been accepted at ICLR 2024! We earn the score of 6(3)8(6)6(6)6(6) here.\n* [2023.12... | [
20,
506,
3,
95,
91,
77,
83,
37,
801,
49,
37,
41,
52,
5,
5,
3,
3,
29,
13,
27,
66,
53,
33,
10,
22
] | [
"passage: TAGS\n#license-mit #arxiv-2310.01852 #region-us \n",
"passage: ## News\n* [2024.01.27] Our MoE-LLaVA is released! A sparse model with 3B parameters outperformed the dense model with 7B parameters.\n* [2024.01.16] Our LanguageBind has been accepted at ICLR 2024! We earn the score of 6(3)8(6)6(6)6(6)... |
5f920a709473cc51bff3900755d930aa9c943a36 | # Dataset Card for "ds_rplan_full_rplanpy_category"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ekuhn/ds_rplan_full_rplanpy_category | [
"region:us"
] | 2023-12-10T12:42:49+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "img", "struct": [{"name": "bytes", "dtype": "binary"}, {"name": "path", "dtype": "null"}]}, {"name": "num_rooms", "dtype": "int64"}], "splits": [{"name": "full", "num_bytes": 68462137, "num_examples": 80788}], "download_size": 36343949, "dataset_size": 68462137}} | 2023-12-10T12:42:58+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ds_rplan_full_rplanpy_category"
More Information needed | [
"# Dataset Card for \"ds_rplan_full_rplanpy_category\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ds_rplan_full_rplanpy_category\"\n\nMore Information needed"
] | [
6,
24
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ds_rplan_full_rplanpy_category\"\n\nMore Information needed"
] |
5fb51c15be815b98663501737c9aa3f69bb106f1 |
## Factorio Blueprint Visualizations SDXL Lora Examples
Examples of using https://huggingface.co/piebro/factorio-blueprint-visualizations-sdxl-lora. The images were generated using 25 inference steps and a guidance_scale of 7. The filenames are composed like this: {counter}\_{seed}\_{prompt}.png.
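Below is a minimal sketch of reproducing such an image with the `diffusers` library; the SDXL base checkpoint and the prompt are assumptions, while the LoRA repo id and the sampling settings come from the description above:

```python
import torch
from diffusers import DiffusionPipeline

# Assumed SDXL base checkpoint; the LoRA repo id comes from the link above.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("piebro/factorio-blueprint-visualizations-sdxl-lora")

# Placeholder prompt; 25 steps and a guidance_scale of 7 match the settings above.
image = pipe(
    "a factorio blueprint visualization",
    num_inference_steps=25,
    guidance_scale=7,
).images[0]
image.save("example.png")
```
| piebro/factorio-blueprint-visualizations-sdxl-lora-examples | [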
"license:cc0-1.0",
"region:us"
] | 2023-12-10T13:25:51+00:00 | {"license": "cc0-1.0", "pretty_name": "Factorio Blueprint Visualizations SDXL Lora Examples"} | 2023-12-10T13:52:02+00:00 | [] | [] | TAGS
#license-cc0-1.0 #region-us
|
## Factorio Blueprint Visualizations SDXL Lora Examples
Examples of using URL. The images were generated using 25 inference steps and a guidance_scale of 7. The filenames are composed like this: {counter}\_{seed}\_{prompt}.png. | [
"## Factorio Blueprint Visualizations SDXL Lora Examples\n\nExamples of the usage of URL The images are generated using 25 inference steps and a guidance_scale of 7. The filenames are composed like this: {counter}\\_{seed}\\_{prompt}.png."
] | [
"TAGS\n#license-cc0-1.0 #region-us \n",
"## Factorio Blueprint Visualizations SDXL Lora Examples\n\nExamples of the usage of URL The images are generated using 25 inference steps and a guidance_scale of 7. The filenames are composed like this: {counter}\\_{seed}\\_{prompt}.png."
] | [
14,
69
] | [
"passage: TAGS\n#license-cc0-1.0 #region-us \n## Factorio Blueprint Visualizations SDXL Lora Examples\n\nExamples of the usage of URL The images are generated using 25 inference steps and a guidance_scale of 7. The filenames are composed like this: {counter}\\_{seed}\\_{prompt}.png."
] |
f08c14a945425c57b91aff03c0c4707a53262068 | # Dataset Card for "alpaca-gpt4"
This dataset contains English instruction-following data generated by GPT-4 using Alpaca prompts for fine-tuning LLMs.
The dataset was originally shared in this repository: https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM. This is just a wrapper for compatibility with Hugging Face's datasets library.
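As a quick illustration, here is a minimal sketch of loading the data with the `datasets` library (the repo id below is this card's own repo; swap in the upstream release if you prefer):

```python
from datasets import load_dataset

# Load the 52K GPT-4 instruction-following examples.
dataset = load_dataset("walkernet/test", split="train")

example = dataset[0]
# Each record carries the four fields described below.
print(example["instruction"])
print(example["input"])
print(example["output"])
```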
## Dataset Description
- **Homepage:** https://instruction-tuning-with-gpt-4.github.io
- **Repository:** https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM
- **Paper:** https://arxiv.org/abs/2304.03277
## Dataset structure
It contains 52K instruction-following examples generated by GPT-4 using the same prompts as in Alpaca.
The dataset has the same format as Alpaca data, except the output is generated by GPT-4:
- `instruction`: `str`, describes the task the model should perform. Each of the 52K instructions is unique.
- `input`: `str`, optional context or input for the task.
- `output`: `str`, the answer to the instruction as generated by `GPT-4`.
- `text`: `str`, all the previous fields concatenated together, plus the same prompt used in Alpaca at the beginning.
## Difference with the original Alpaca dataset
The original Alpaca dataset used text-davinci-003 to complete the prompts. This dataset uses those same prompts, but generates the completions with GPT-4. Thus, in general, the responses are of higher quality and length. Here is an example:
#### Example from Alpaca-GPT4:
```bash
{'instruction': 'Identify the odd one out.',
'input': 'Twitter, Instagram, Telegram',
'output': 'The odd one out is Telegram. Twitter and Instagram are social media platforms mainly for sharing information, images and videos while Telegram is a cloud-based instant messaging and voice-over-IP service.',
'text': 'Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.\n\n### Instruction:\nIdentify the odd one out.\n\n### Input:\nTwitter, Instagram, Telegram\n\n### Response:\nThe odd one out is Telegram. Twitter and Instagram are social media platforms mainly for sharing information, images and videos while Telegram is a cloud-based instant messaging and voice-over-IP service.'}
```
#### Same example from original Alpaca:
```bash
{'instruction': 'Identify the odd one out.',
'input': 'Twitter, Instagram, Telegram',
'output': 'Telegram',
'text': 'Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.\n\n### Instruction:\nIdentify the odd one out.\n\n### Input:\nTwitter, Instagram, Telegram\n\n### Response:\nTelegram'}
```
## Licensing Information
The dataset is available under the [Creative Commons NonCommercial (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode). | walkernet/test | [
"task_categories:text-generation",
"task_categories:conversational",
"task_categories:question-answering",
"size_categories:10K<n<100K",
"language:ru",
"license:cc-by-nc-4.0",
"gpt4",
"alpaca",
"instruction-finetuning",
"arxiv:2304.03277",
"region:us"
] | 2023-12-10T13:53:58+00:00 | {"language": ["ru"], "license": "cc-by-nc-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation", "conversational", "question-answering"], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 88566301, "num_examples": 52002}], "download_size": 48393562, "dataset_size": 88566301}, "tags": ["gpt4", "alpaca", "instruction-finetuning"]} | 2023-12-10T18:07:52+00:00 | [
"2304.03277"
] | [
"ru"
] | TAGS
#task_categories-text-generation #task_categories-conversational #task_categories-question-answering #size_categories-10K<n<100K #language-Russian #license-cc-by-nc-4.0 #gpt4 #alpaca #instruction-finetuning #arxiv-2304.03277 #region-us
| # Dataset Card for "alpaca-gpt4"
This dataset contains English instruction-following data generated by GPT-4 using Alpaca prompts for fine-tuning LLMs.
The dataset was originally shared in this repository: URL This is just a wrapper for compatibility with Hugging Face's datasets library.
## Dataset Description
- Homepage: URL
- Repository: URL
- Paper: URL
## Dataset structure
It contains 52K instruction-following examples generated by GPT-4 using the same prompts as in Alpaca.
The dataset has the same format as Alpaca data, except the output is generated by GPT-4:
- 'instruction': 'str', describes the task the model should perform. Each of the 52K instructions is unique.
- 'input': 'str', optional context or input for the task.
- 'output': 'str', the answer to the instruction as generated by 'GPT-4'.
- 'text': 'str', all the previous fields concatenated together, plus the same prompt used in Alpaca at the beginning.
## Difference with the original Alpaca dataset
The original Alpaca dataset used text-davinci-003 to complete the prompts. This dataset uses those same prompts, but generates the completions with GPT-4. Thus, in general, the responses are of higher quality and length. Here is an example:
#### Example from Alpaca-GPT4:
#### Same example from original Alpaca:
## Licensing Information
The dataset is available under the Creative Commons NonCommercial (CC BY-NC 4.0). | [
"# Dataset Card for \"alpaca-gpt4\"\n\nThis dataset contains English Instruction-Following generated by GPT-4 using Alpaca prompts for fine-tuning LLMs.\n\nThe dataset was originaly shared in this repository: URL This is just a wraper for compatibility with huggingface's datasets library.",
"## Dataset Descriptio... | [
"TAGS\n#task_categories-text-generation #task_categories-conversational #task_categories-question-answering #size_categories-10K<n<100K #language-Russian #license-cc-by-nc-4.0 #gpt4 #alpaca #instruction-finetuning #arxiv-2304.03277 #region-us \n",
"# Dataset Card for \"alpaca-gpt4\"\n\nThis dataset contains Engli... | [
89,
80,
18,
159,
73,
12,
9,
25
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-conversational #task_categories-question-answering #size_categories-10K<n<100K #language-Russian #license-cc-by-nc-4.0 #gpt4 #alpaca #instruction-finetuning #arxiv-2304.03277 #region-us \n# Dataset Card for \"alpaca-gpt4\"\n\nThis dataset contains En... |
3d1d81399092a46968b946271198cdecdc171209 |
# Dataset Card for Dataset Name
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
Each instance in the training, development, and test sets is a sentence pair. The instance is labeled with a score representing the degree of semantic textual relatedness between the two sentences. The scores can range from 0 (maximally unrelated) to 1 (maximally related). These gold label scores have been determined through manual annotation. Specifically, a comparative annotation approach was used to avoid known limitations of traditional rating scale annotation methods. This comparative annotation process (which avoids several biases of traditional rating scales) led to a high reliability of the final relatedness rankings. Further details about the task, the method of data annotation, how STR is different from semantic textual similarity, applications of semantic textual relatedness, etc. can be found in this paper.
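For illustration, here is a minimal sketch of loading the sentence pairs and their gold relatedness scores (the repo id and the `train`/`dev` split names are taken from this card's metadata):

```python
from datasets import load_dataset

str_data = load_dataset("kietnt0603/SemEval2024-STR")

pair = str_data["train"][0]
# Each instance is a sentence pair with a gold relatedness score in [0, 1].
print(pair["Sentence1"])
print(pair["Sentence2"])
print(pair["Score"])
```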
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/semantic-textual-relatedness/Semantic_Relatedness_SemEval2024/tree/main
| kietnt0603/SemEval2024-STR | [
"size_categories:10K<n<100K",
"language:am",
"language:ha",
"language:en",
"language:es",
"language:te",
"language:ar",
"language:af",
"license:mit",
"Semantic Textual Relatedness",
"region:us"
] | 2023-12-10T14:10:33+00:00 | {"language": ["am", "ha", "en", "es", "te", "ar", "af"], "license": "mit", "size_categories": ["10K<n<100K"], "tags": ["Semantic Textual Relatedness"], "dataset_info": {"features": [{"name": "PairID", "dtype": "string"}, {"name": "Language", "dtype": "string"}, {"name": "Sentence1", "dtype": "string"}, {"name": "Sentence2", "dtype": "string"}, {"name": "Length", "dtype": "int64"}, {"name": "Score", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 4248215, "num_examples": 15123}, {"name": "dev", "num_bytes": 460985, "num_examples": 1390}], "download_size": 2400795, "dataset_size": 4709200}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "dev", "path": "data/dev-*"}]}]} | 2023-12-20T11:05:00+00:00 | [] | [
"am",
"ha",
"en",
"es",
"te",
"ar",
"af"
] | TAGS
#size_categories-10K<n<100K #language-Amharic #language-Hausa #language-English #language-Spanish #language-Telugu #language-Arabic #language-Afrikaans #license-mit #Semantic Textual Relatedness #region-us
|
# Dataset Card for Dataset Name
## Dataset Details
### Dataset Description
Each instance in the training, development, and test sets is a sentence pair. The instance is labeled with a score representing the degree of semantic textual relatedness between the two sentences. The scores can range from 0 (maximally unrelated) to 1 (maximally related). These gold label scores have been determined through manual annotation. Specifically, a comparative annotation approach was used to avoid known limitations of traditional rating scale annotation methods. This comparative annotation process (which avoids several biases of traditional rating scales) led to a high reliability of the final relatedness rankings. Further details about the task, the method of data annotation, how STR is different from semantic textual similarity, applications of semantic textual relatedness, etc. can be found in this paper.
### Dataset Sources
- Repository: URL
| [
"# Dataset Card for Dataset Name",
"## Dataset Details",
"### Dataset Description\n\n\nEach instance in the training, development, and test sets is a sentence pair. The instance is labeled with a score representing the degree of semantic textual relatedness between the two sentences. The scores can range from 0... | [
"TAGS\n#size_categories-10K<n<100K #language-Amharic #language-Hausa #language-English #language-Spanish #language-Telugu #language-Arabic #language-Afrikaans #license-mit #Semantic Textual Relatedness #region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Details",
"### Dataset Description\n\n\nEach ... | [
65,
8,
4,
185,
13
] | [
"passage: TAGS\n#size_categories-10K<n<100K #language-Amharic #language-Hausa #language-English #language-Spanish #language-Telugu #language-Arabic #language-Afrikaans #license-mit #Semantic Textual Relatedness #region-us \n# Dataset Card for Dataset Name## Dataset Details### Dataset Description\n\n\nEach instance ... |
234777868459a272a26a04bb8882764b6b873dc5 |

These are pre-parsed PDFs that can then be used in NLP / LLM applications, in a spirit of collaboration.

The various legal codes were extracted in XML format here: https://codes.droit.org/

The XML format allows better preprocessing of the legal codes.

The data structure:

- in raw/ you will find the various codes in XML format.
- in notebooks_preprocess/ you will find the notebooks used to build the final dataset (a minimal parsing sketch follows below).
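A minimal parsing sketch using only the standard library; the element names in the XML exports are not documented here, so the code only inspects the top-level structure instead of assuming a schema:

```python
import xml.etree.ElementTree as ET
from pathlib import Path

# Walk the legal codes stored as XML under raw/ (adjust the glob if the files are nested).
for xml_path in sorted(Path("raw").glob("*.xml")):
    root = ET.parse(xml_path).getroot()
    # Print the root tag and the number of top-level children;
    # the actual tag names depend on the codes.droit.org export schema.
    print(xml_path.name, root.tag, len(list(root)))
```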
| Forbu14/LoiLibre | [
"language:fr",
"license:apache-2.0",
"legal",
"region:us"
] | 2023-12-10T14:14:54+00:00 | {"language": ["fr"], "license": "apache-2.0", "pretty_name": "LoiLibre", "tags": ["legal"]} | 2023-12-10T19:11:24+00:00 | [] | [
"fr"
] | TAGS
#language-French #license-apache-2.0 #legal #region-us
|
!logo
These are pre-parsed PDFs that can then be used in NLP / LLM applications, in a spirit of collaboration.

The various legal codes were extracted in XML format here: URL

The XML format allows better preprocessing of the legal codes.

The data structure:

- in raw/ you will find the various codes in XML format.
- in notebooks_preprocess/ you will find the notebooks used to build the final dataset.
| [] | [
"TAGS\n#language-French #license-apache-2.0 #legal #region-us \n"
] | [
22
] | [
"passage: TAGS\n#language-French #license-apache-2.0 #legal #region-us \n"
] |
a7956e63162d4a550e188ccf486ff4abb2b13523 | # Dataset Card for "rapidapi-example-responses-workflows"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | davidfant/rapidapi-example-responses-workflows | [
"region:us"
] | 2023-12-10T14:19:02+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "steps", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3227895, "num_examples": 1000}], "download_size": 1173873, "dataset_size": 3227895}} | 2023-12-11T22:19:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "rapidapi-example-responses-workflows"
More Information needed | [
"# Dataset Card for \"rapidapi-example-responses-workflows\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"rapidapi-example-responses-workflows\"\n\nMore Information needed"
] | [
6,
24
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"rapidapi-example-responses-workflows\"\n\nMore Information needed"
] |
80e8d667d5cc28e827d880f6723ed373d8bcc3f0 |
# InstructImages
The following dataset was created in the style of the Dalle3 paper:
1. Caption all images with an LVM (Llava13b in my case)
2. Improve the captions with GPT4
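A minimal sketch of that two-step flow might look like the following (the model IDs, prompt, and helper name are assumptions for illustration; the author's actual pipeline is not published here):

```python
# Illustrative sketch of the caption-then-refine flow; model IDs,
# prompts, and the helper name are assumptions, not the author's code.
from transformers import pipeline
from openai import OpenAI

captioner = pipeline("image-to-text", model="llava-hf/llava-1.5-13b-hf")
client = OpenAI()  # expects OPENAI_API_KEY in the environment

def caption_and_refine(image_path: str) -> str:
    # Step 1: draft a caption with the vision-language model.
    prompt = "USER: <image>\nDescribe this image in detail. ASSISTANT:"
    draft = captioner(image_path, prompt=prompt)[0]["generated_text"]
    # Step 2: ask GPT-4 to rewrite the draft as a cleaner caption.
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user",
                   "content": f"Improve this image caption:\n{draft}"}],
    )
    return response.choices[0].message.content
```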
I also plan to open-source an RLAIF pipeline built on these images. | AlexWortega/InstructCaptions2 | [
"language:en",
"license:apache-2.0",
"region:us"
] | 2023-12-10T14:33:07+00:00 | {"language": ["en"], "license": "apache-2.0", "pretty_name": "InstructImages", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 33059118217.928, "num_examples": 22776}], "download_size": 33273147003, "dataset_size": 33059118217.928}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-10T15:07:33+00:00 | [] | [
"en"
] | TAGS
#language-English #license-apache-2.0 #region-us
|
# InstructImages
The following dataset was created in the style of the Dalle3 paper:
1. Caption all images with an LVM (Llava13b in my case)
2. Improve the captions with GPT4
I also plan to open-source an RLAIF pipeline built on these images. | [
"# InstructImages\nFollowing dataset created in Dalle3 paper style \n1. Caption all images with LVM(Llava13b in my case)\n2. Improve captions with GPT4\n\n\nAlso i have a plans to open source RLAIF pipeline with these images."
] | [
"TAGS\n#language-English #license-apache-2.0 #region-us \n",
"# InstructImages\nFollowing dataset created in Dalle3 paper style \n1. Caption all images with LVM(Llava13b in my case)\n2. Improve captions with GPT4\n\n\nAlso i have a plans to open source RLAIF pipeline with these images."
] | [
18,
59
] | [
"passage: TAGS\n#language-English #license-apache-2.0 #region-us \n# InstructImages\nFollowing dataset created in Dalle3 paper style \n1. Caption all images with LVM(Llava13b in my case)\n2. Improve captions with GPT4\n\n\nAlso i have a plans to open source RLAIF pipeline with these images."
] |
48b8fbf7e914957b5fd58ae98047de795333a5aa | Vtuber tachi-e dataset scraped from official sites:
Nijisanji, Hololive, Vspo, Noripro, 774inc | junjuice0/vtuber-tachi-e | [
"size_categories:n<1K",
"art",
"region:us"
] | 2023-12-10T14:37:48+00:00 | {"size_categories": ["n<1K"], "tags": ["art"]} | 2023-12-10T15:21:07+00:00 | [] | [] | TAGS
#size_categories-n<1K #art #region-us
| Vtuber tachi-e dataset scraped from official sites:
Nijisanji, Hololive, Vspo, Noripro, 774inc | [] | [
"TAGS\n#size_categories-n<1K #art #region-us \n"
] | [
18
] | [
"passage: TAGS\n#size_categories-n<1K #art #region-us \n"
] |
f1e5d865006c5422ae47bc6bd5eb6715d78f2004 | # Dataset Card for "rapidapi-example-responses-workflow-steps"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | davidfant/rapidapi-example-responses-workflow-steps | [
"region:us"
] | 2023-12-10T14:42:05+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "index", "dtype": "int64"}, {"name": "workflow", "dtype": "string"}, {"name": "plan", "dtype": "string"}, {"name": "reasoning", "dtype": "string"}, {"name": "label", "dtype": "string"}, {"name": "data", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24204115, "num_examples": 6018}], "download_size": 2084270, "dataset_size": 24204115}} | 2023-12-12T01:18:02+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "rapidapi-example-responses-workflow-steps"
More Information needed | [
"# Dataset Card for \"rapidapi-example-responses-workflow-steps\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"rapidapi-example-responses-workflow-steps\"\n\nMore Information needed"
] | [
6,
26
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"rapidapi-example-responses-workflow-steps\"\n\nMore Information needed"
] |