# [doc] image dataset 7

This dataset contains 2 jpeg files in the `red` directory and 2 jpeg files in the `green` directory.

- **Dataset:** polinaeterna/doc-image-7
- **Tags:** size_categories:n<1K, region:us
- **Created:** 2023-12-04T17:26:12+00:00
- **Last modified:** 2023-12-04T17:33:22+00:00
# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openaccess-ai-collective/DPOpenHermes-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openaccess-ai-collective/DPOpenHermes-7B](https://huggingface.co/openaccess-ai-collective/DPOpenHermes-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
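Since dashes and colons are not valid in split names, a run timestamp such as `2023-12-04T17:43:16.068019` appears as the split `2023_12_04T17_43_16.068019`. A minimal sketch of that mapping (illustrative only; `timestamp_to_split` is a hypothetical helper, and the actual naming is done by the leaderboard tooling):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp like '2023-12-04T17:43:16.068019'
    to a split name by replacing '-' and ':' with '_'."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-12-04T17:43:16.068019"))
# -> 2023_12_04T17_43_16.068019
```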
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T17:43:16.068019](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B/blob/main/results_2023-12-04T17-43-16.068019.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6408628483411186,
"acc_stderr": 0.03219894205377224,
"acc_norm": 0.6438076616391943,
"acc_norm_stderr": 0.032835537242155946,
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5692137581021863,
"mc2_stderr": 0.015366764842114067
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000324,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.01384746051889298
},
"harness|hellaswag|10": {
"acc": 0.6750647281418044,
"acc_stderr": 0.004673934837150448,
"acc_norm": 0.8589922326229835,
"acc_norm_stderr": 0.0034731828909689687
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723285,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723285
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562076,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562076
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908234,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099857,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.02485636418450322,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.02485636418450322
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903347,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3240223463687151,
"acc_stderr": 0.015652542496421114,
"acc_norm": 0.3240223463687151,
"acc_norm_stderr": 0.015652542496421114
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669975,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669975
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5692137581021863,
"mc2_stderr": 0.015366764842114067
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.5481425322213799,
"acc_stderr": 0.013708494995677646
}
}
```
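Because the block above is a plain Python-style dict, it can be post-processed directly, e.g. to average the per-subject `hendrycksTest` (MMLU) accuracies. This is an illustrative sketch, not part of the card; `mmlu_average` is a hypothetical helper:

```python
import statistics

def mmlu_average(results: dict) -> float:
    """Average the 'acc' metric over all hendrycksTest subject entries."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return statistics.mean(accs)

# Toy dict with two subjects (a real run reports 57):
sample = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7},
    "harness|winogrande|5": {"acc": 0.78},  # ignored: not an MMLU subject
}
print(round(mmlu_average(sample), 4))  # -> 0.65
```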
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-43-16.068019.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-43-16.068019.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-43-16.068019.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-43-16.068019.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-43-16.068019.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-43-16.068019.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": 
[{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", 
"path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-43-16.068019.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": 
"2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", 
"path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["**/details_harness|winogrande|5_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["**/details_harness|winogrande|5_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T17-43-16.068019.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T17_25_36.018483", "path": ["results_2023-12-04T17-25-36.018483.parquet"]}, {"split": "2023_12_04T17_43_16.068019", "path": ["results_2023-12-04T17-43-16.068019.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T17-43-16.068019.parquet"]}]}]} | 2023-12-04T17:46:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model openaccess-ai-collective/DPOpenHermes-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
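For instance (a minimal sketch; the repo id below is inferred from the leaderboard's usual `details_<org>__<model>` naming convention and is an assumption, not confirmed by this card):

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's "details_<org>__<model>"
# naming convention; adjust it if the actual repo name differs.
data = load_dataset(
    "open-llm-leaderboard/details_openaccess-ai-collective__DPOpenHermes-7B",
    "harness_winogrande_5",
    split="train",
)
```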
## Latest results
These are the latest results from run 2023-12-04T17:43:16.068019 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation ru... | [
6,
23,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/DPOpenHermes-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of mode... |
0346406d108d77fe5f8faa36d5fa3474c5d67154 | # Dataset Card for "t5_small_test_set_context_len_512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | yardeny/t5_small_test_set_context_len_512 | [
"region:us"
] | 2023-12-04T17:31:26+00:00 | {"dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 410880, "num_examples": 160}], "download_size": 178658, "dataset_size": 410880}} | 2023-12-04T17:31:33+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "t5_small_test_set_context_len_512"
More Information needed | [
"# Dataset Card for \"t5_small_test_set_context_len_512\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"t5_small_test_set_context_len_512\"\n\nMore Information needed"
] | [
6,
27
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"t5_small_test_set_context_len_512\"\n\nMore Information needed"
] |
2f9e95b8dfb1214e5ae3d60c81753fb6ea9348f8 |
# Dataset Card for Evaluation run of beberik/Nyxene-11B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beberik/Nyxene-11B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [beberik/Nyxene-11B](https://huggingface.co/beberik/Nyxene-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beberik__Nyxene-11B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T17:29:47.826048](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-11B/blob/main/results_2023-12-04T17-29-47.826048.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.651058700269482,
"acc_stderr": 0.0320152029210293,
"acc_norm": 0.6547280296636943,
"acc_norm_stderr": 0.03264773034799592,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5749990941717074,
"mc2_stderr": 0.015569738564249067
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620199,
"acc_norm": 0.6834470989761092,
"acc_norm_stderr": 0.013592431519068075
},
"harness|hellaswag|10": {
"acc": 0.6625174268074089,
"acc_stderr": 0.004718846448021786,
"acc_norm": 0.8454491137223661,
"acc_norm_stderr": 0.0036073726062951024
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305528,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305528
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678492,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678492
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135118,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135118
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504515,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504515
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740546,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740546
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5749990941717074,
"mc2_stderr": 0.015569738564249067
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881575
},
"harness|gsm8k|5": {
"acc": 0.5178165276724791,
"acc_stderr": 0.01376373837986793
}
}
```
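For quick sanity checks, the per-task numbers above can be aggregated directly without reloading the parquet files. A minimal sketch, using a few accuracy values copied verbatim from the JSON above (extend the dict with more tasks as needed):

```python
# Macro-average accuracy over a handful of the per-task results above.
task_acc = {
    "hendrycksTest-abstract_algebra": 0.32,
    "hendrycksTest-anatomy": 0.6,
    "hendrycksTest-astronomy": 0.7171052631578947,
    "hendrycksTest-business_ethics": 0.62,
}

macro_avg = sum(task_acc.values()) / len(task_acc)
print(f"macro-average acc over {len(task_acc)} tasks: {macro_avg:.4f}")
# prints: macro-average acc over 4 tasks: 0.5643
```

Note this is an unweighted (macro) average over the selected tasks only; the "all" block above aggregates over every task in the run.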
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_beberik__Nyxene-11B | [
"region:us"
] | 2023-12-04T17:32:40+00:00 | {"pretty_name": "Evaluation run of beberik/Nyxene-11B", "dataset_summary": "Dataset automatically created during the evaluation run of model [beberik/Nyxene-11B](https://huggingface.co/beberik/Nyxene-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beberik__Nyxene-11B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T17:29:47.826048](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-11B/blob/main/results_2023-12-04T17-29-47.826048.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.651058700269482,\n \"acc_stderr\": 0.0320152029210293,\n \"acc_norm\": 0.6547280296636943,\n \"acc_norm_stderr\": 0.03264773034799592,\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5749990941717074,\n \"mc2_stderr\": 0.015569738564249067\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620199,\n \"acc_norm\": 0.6834470989761092,\n \"acc_norm_stderr\": 0.013592431519068075\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6625174268074089,\n \"acc_stderr\": 0.004718846448021786,\n \"acc_norm\": 0.8454491137223661,\n \"acc_norm_stderr\": 0.0036073726062951024\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544074,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544074\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n 
\"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305528,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305528\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 
0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 
0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375797,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375797\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n 
\"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n \"acc_stderr\": 0.016232826818678492,\n \"acc_norm\": 0.37988826815642457,\n \"acc_norm_stderr\": 0.016232826818678492\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879912,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879912\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n 
\"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135118,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135118\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.012731102790504515,\n \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.012731102790504515\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740546,\n \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740546\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n 
\"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5749990941717074,\n \"mc2_stderr\": 0.015569738564249067\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881575\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5178165276724791,\n \"acc_stderr\": 0.01376373837986793\n }\n}\n```", "repo_url": "https://huggingface.co/beberik/Nyxene-11B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-29-47.826048.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-29-47.826048.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-29-47.826048.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-29-47.826048.parquet", 
"**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-29-47.826048.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-29-47.826048.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-29-47.826048.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-29-47.826048.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": 
["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-29-47.826048.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-29-47.826048.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["**/details_harness|winogrande|5_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T17-29-47.826048.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T17_29_47.826048", "path": ["results_2023-12-04T17-29-47.826048.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T17-29-47.826048.parquet"]}]}]} | 2023-12-04T17:33:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of beberik/Nyxene-11B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model beberik/Nyxene-11B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
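A minimal sketch of such a call, wrapped in a helper so nothing is downloaded until it is invoked. The repository id follows the leaderboard's `details_<org>__<model>` naming convention, and the default config and split names are assumptions carried over from sibling evaluation-run cards:

```python
def load_details(config="harness_winogrande_5", split="train"):
    """Load one evaluated task's details for beberik/Nyxene-11B.

    The repository id follows the leaderboard's details_<org>__<model>
    convention; the default config and split names are assumptions
    borrowed from similar evaluation-run cards.
    """
    from datasets import load_dataset  # deferred: requires the `datasets` package

    return load_dataset(
        "open-llm-leaderboard/details_beberik__Nyxene-11B", config, split=split
    )
```

Calling `load_details()` then fetches the parquet files for the Winogrande run.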
## Latest results
These are the latest results from run 2023-12-04T17:29:47.826048 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of beberik/Nyxene-11B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model beberik/Nyxene-11B on the Open LLM L... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of beberik/Nyxene-11B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model beberik/Nyx... | [
6,
17,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of beberik/Nyxene-11B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model beberik/Nyxene-11B o... |
8443093e6685aae323f0838b6b1a7ffc484a6f22 |
# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-1b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/Deacon-1b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deacon-1b](https://huggingface.co/KnutJaegersberg/Deacon-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deacon-1b",
"harness_winogrande_5",
split="train")
```
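Besides `latest`, each configuration keeps one split per run, named with a zero-padded `YYYY_MM_DDTHH_MM_SS` timestamp, so the most recent run can be picked with a plain lexicographic `max`. A small sketch (the split names below mirror this card's convention):

```python
def newest_run(split_names):
    """Return the most recent timestamped split, ignoring the
    'latest' alias; zero-padded timestamps sort lexicographically."""
    return max(s for s in split_names if s != "latest")

print(newest_run(["2023_12_04T17_32_52.596072", "latest"]))
# -> 2023_12_04T17_32_52.596072
```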
## Latest results
These are the [latest results from run 2023-12-04T17:32:52.596072](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deacon-1b/blob/main/results_2023-12-04T17-32-52.596072.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2547551700512293,
"acc_stderr": 0.030605522190513053,
"acc_norm": 0.2559364936006559,
"acc_norm_stderr": 0.03137480856769965,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.01445084671412389,
"mc2": 0.35049035383875937,
"mc2_stderr": 0.014299155547047497
},
"harness|arc:challenge|25": {
"acc": 0.3003412969283277,
"acc_stderr": 0.013395909309957004,
"acc_norm": 0.3242320819112628,
"acc_norm_stderr": 0.013678810399518827
},
"harness|hellaswag|10": {
"acc": 0.44722166899024096,
"acc_stderr": 0.004961904949171387,
"acc_norm": 0.5862378012348137,
"acc_norm_stderr": 0.004915003499517835
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.030167533468632702,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.030167533468632702
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.026749899771241238,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.026749899771241238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749884,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749884
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.02210112878741543,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.02210112878741543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818114,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818114
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2064516129032258,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.2064516129032258,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752937,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752937
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.021916957709213796,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.021916957709213796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.02564947026588919,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.02564947026588919
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1761467889908257,
"acc_stderr": 0.016332882393431378,
"acc_norm": 0.1761467889908257,
"acc_norm_stderr": 0.016332882393431378
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402544,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402544
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.04058042015646036,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.04058042015646036
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3034188034188034,
"acc_stderr": 0.03011821010694266,
"acc_norm": 0.3034188034188034,
"acc_norm_stderr": 0.03011821010694266
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27330779054916987,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.27330779054916987,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262206,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262206
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880596,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880596
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25358539765319427,
"acc_stderr": 0.01111171533610113,
"acc_norm": 0.25358539765319427,
"acc_norm_stderr": 0.01111171533610113
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.029029422815681404,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.029029422815681404
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378988,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378988
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.01445084671412389,
"mc2": 0.35049035383875937,
"mc2_stderr": 0.014299155547047497
},
"harness|winogrande|5": {
"acc": 0.595895816890292,
"acc_stderr": 0.013791610664670849
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.002267537102254515
}
}
```
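The per-task entries above are keyed as `harness|<task>|<n-shot>`, each holding `acc` (and usually `acc_norm`) plus their standard errors. A small sketch of navigating that structure, using a hand-copied excerpt of the values above:

```python
# Hand-copied excerpt of the aggregated results shown above.
results = {
    "all": {"acc": 0.2547551700512293, "acc_norm": 0.2559364936006559},
    "harness|arc:challenge|25": {"acc": 0.3003412969283277,
                                 "acc_norm": 0.3242320819112628},
    "harness|hellaswag|10": {"acc": 0.44722166899024096,
                             "acc_norm": 0.5862378012348137},
    "harness|gsm8k|5": {"acc": 0.006823351023502654},
}

def accuracy(task, normalized=False):
    """Return acc_norm when requested and available, else acc."""
    entry = results[task]
    key = "acc_norm" if normalized and "acc_norm" in entry else "acc"
    return entry[key]

# Strongest task in this excerpt by normalized accuracy.
best = max((t for t in results if t != "all"),
           key=lambda t: accuracy(t, normalized=True))
print(best)  # -> harness|hellaswag|10
```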
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
0.28313253012048195,\n \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03565079670708311,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03565079670708311\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n \"mc1_stderr\": 0.01445084671412389,\n \"mc2\": 0.35049035383875937,\n \"mc2_stderr\": 0.014299155547047497\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.595895816890292,\n \"acc_stderr\": 0.013791610664670849\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.002267537102254515\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/Deacon-1b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-32-52.596072.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-32-52.596072.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-32-52.596072.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-32-52.596072.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-32-52.596072.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-32-52.596072.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-32-52.596072.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": 
"2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": 
"2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-32-52.596072.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["**/details_harness|winogrande|5_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T17-32-52.596072.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T17_32_52.596072", "path": ["results_2023-12-04T17-32-52.596072.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T17-32-52.596072.parquet"]}]}]} | 2023-12-04T17:35:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-1b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model KnutJaegersberg/Deacon-1b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-04T17:32:52.596072 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
2e235a4d9361599cf623015a01bcac1ab90a8bd0 |
# Dataset Card for Evaluation run of S4sch/zephyr-neural-chat-frankenmerge11b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/S4sch/zephyr-neural-chat-frankenmerge11b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [S4sch/zephyr-neural-chat-frankenmerge11b](https://huggingface.co/S4sch/zephyr-neural-chat-frankenmerge11b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_S4sch__zephyr-neural-chat-frankenmerge11b",
"harness_winogrande_5",
split="train")
```
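The 63 configurations follow a consistent naming scheme, so the config name passed to `load_dataset` can also be built programmatically. A minimal sketch (the helper name is illustrative, not part of the `datasets` API), using the `harness_hendrycksTest_<subject>_5` pattern visible in this repo's configuration list:

```python
# Build the configuration name for an MMLU (hendrycksTest) subtask,
# following the "harness_hendrycksTest_<subject>_<n_shot>" pattern
# used by this dataset's configurations.
def mmlu_config(subject: str, n_shot: int = 5) -> str:
    return f"harness_hendrycksTest_{subject}_{n_shot}"

print(mmlu_config("world_religions"))
# harness_hendrycksTest_world_religions_5
```

The other tasks use analogous names, e.g. `harness_winogrande_5` or `harness_truthfulqa_mc_0`.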
## Latest results
These are the [latest results from run 2023-12-04T17:40:46.451568](https://huggingface.co/datasets/open-llm-leaderboard/details_S4sch__zephyr-neural-chat-frankenmerge11b/blob/main/results_2023-12-04T17-40-46.451568.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6090298979840253,
"acc_stderr": 0.032809646949895625,
"acc_norm": 0.618940969899117,
"acc_norm_stderr": 0.033576053409858746,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6062876441761156,
"mc2_stderr": 0.0158161206163554
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009124,
"acc_norm": 0.6151877133105802,
"acc_norm_stderr": 0.014218371065251109
},
"harness|hellaswag|10": {
"acc": 0.6596295558653654,
"acc_stderr": 0.004728653488866922,
"acc_norm": 0.8408683529177454,
"acc_norm_stderr": 0.0036505121583062794
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.02989060968628665,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.02989060968628665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.03692820767264866,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.03692820767264866
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.02466674491518721,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.02466674491518721
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343136,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343136
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.789272030651341,
"acc_stderr": 0.014583812465862538,
"acc_norm": 0.789272030651341,
"acc_norm_stderr": 0.014583812465862538
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.025469770149400175,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.025469770149400175
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.016269088663959402,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.016269088663959402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.027305308076274695,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.027305308076274695
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.01271994954303221,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.01271994954303221
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.029465133639776132,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.029465133639776132
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355435,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.03063565515038764,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.03063565515038764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623328,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623328
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6062876441761156,
"mc2_stderr": 0.0158161206163554
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803159
},
"harness|gsm8k|5": {
"acc": 0.07429871114480667,
"acc_stderr": 0.007223844172845574
}
}
```
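As a rough sanity check on these numbers (not part of the evaluation pipeline): the reported `acc_stderr` values are consistent with the standard error of the mean over binary per-example scores. Assuming WinoGrande's 1267-example validation split (an assumption — the split size is not stated in this card), the `harness|winogrande|5` accuracy above corresponds to 966 correct answers:

```python
import math

n = 1267                    # assumed WinoGrande validation-set size
acc = 0.7624309392265194    # reported harness|winogrande|5 accuracy
correct = round(acc * n)    # 966 correct answers

# Standard error of the mean of 0/1 scores (with Bessel's correction),
# which for binary outcomes reduces to sqrt(p * (1 - p) / (n - 1)).
p = correct / n
stderr = math.sqrt(p * (1 - p) / (n - 1))
print(correct, round(stderr, 6))  # 966 0.011961
```

This matches the reported `acc_stderr` of 0.011961298905803159 to within rounding.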
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_S4sch__zephyr-neural-chat-frankenmerge11b | [
"region:us"
] | 2023-12-04T17:43:36+00:00 | {"pretty_name": "Evaluation run of S4sch/zephyr-neural-chat-frankenmerge11b", "dataset_summary": "Dataset automatically created during the evaluation run of model [S4sch/zephyr-neural-chat-frankenmerge11b](https://huggingface.co/S4sch/zephyr-neural-chat-frankenmerge11b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_S4sch__zephyr-neural-chat-frankenmerge11b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T17:40:46.451568](https://huggingface.co/datasets/open-llm-leaderboard/details_S4sch__zephyr-neural-chat-frankenmerge11b/blob/main/results_2023-12-04T17-40-46.451568.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6090298979840253,\n \"acc_stderr\": 0.032809646949895625,\n \"acc_norm\": 0.618940969899117,\n \"acc_norm_stderr\": 0.033576053409858746,\n \"mc1\": 0.4418604651162791,\n \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6062876441761156,\n \"mc2_stderr\": 0.0158161206163554\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009124,\n \"acc_norm\": 0.6151877133105802,\n \"acc_norm_stderr\": 0.014218371065251109\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6596295558653654,\n \"acc_stderr\": 0.004728653488866922,\n \"acc_norm\": 0.8408683529177454,\n \"acc_norm_stderr\": 0.0036505121583062794\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.02989060968628665,\n \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.02989060968628665\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.03692820767264866,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.03692820767264866\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n 
\"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.02466674491518721,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.02466674491518721\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343136,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343136\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n 
\"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n \"acc_stderr\": 0.014583812465862538,\n \"acc_norm\": 0.789272030651341,\n \"acc_norm_stderr\": 0.014583812465862538\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400175,\n \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400175\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n \"acc_stderr\": 0.016269088663959402,\n \"acc_norm\": 0.3843575418994413,\n \"acc_norm_stderr\": 0.016269088663959402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.027305308076274695,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.027305308076274695\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n 
\"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n \"acc_stderr\": 0.01271994954303221,\n \"acc_norm\": 0.4556714471968709,\n \"acc_norm_stderr\": 0.01271994954303221\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.029465133639776132,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.029465133639776132\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355435,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355435\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.03063565515038764,\n \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.03063565515038764\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623328,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623328\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 
0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6062876441761156,\n \"mc2_stderr\": 0.0158161206163554\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803159\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07429871114480667,\n \"acc_stderr\": 0.007223844172845574\n }\n}\n```", "repo_url": "https://huggingface.co/S4sch/zephyr-neural-chat-frankenmerge11b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-40-46.451568.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-40-46.451568.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-40-46.451568.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-40-46.451568.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-40-46.451568.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-40-46.451568.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-40-46.451568.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-40-46.451568.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["**/details_harness|winogrande|5_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T17-40-46.451568.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T17_40_46.451568", "path": ["results_2023-12-04T17-40-46.451568.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T17-40-46.451568.parquet"]}]}]} | 2023-12-04T17:44:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of S4sch/zephyr-neural-chat-frankenmerge11b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model S4sch/zephyr-neural-chat-frankenmerge11b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-04T17:40:46.451568 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
bf99f7d2b01019d50bd533088f7bafd4063b403a |
# Dataset Card for Evaluation run of mrfakename/NeuralOrca-7B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mrfakename/NeuralOrca-7B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mrfakename/NeuralOrca-7B-v1](https://huggingface.co/mrfakename/NeuralOrca-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mrfakename__NeuralOrca-7B-v1",
"harness_winogrande_5",
	split="latest")
```
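Each evaluated task is exposed as its own configuration (see the `configs` list in this card's metadata). The helper below is a minimal sketch, inferred from the config names in that metadata, of how a metric key from the results JSON (e.g. `harness|truthfulqa:mc|0`) maps to the config name passed to `load_dataset`; the function name `config_name` is illustrative, not part of any library:

```python
def config_name(metric_key: str) -> str:
    """Map a results metric key (e.g. 'harness|truthfulqa:mc|0') to the
    dataset config name used with load_dataset (e.g. 'harness_truthfulqa_mc_0').

    Inferred mapping: '|' separates harness / task / n-shot, and any ':' or
    '-' inside the task name becomes '_'.
    """
    harness, task, shots = metric_key.split("|")
    task = task.replace(":", "_").replace("-", "_")
    return f"{harness}_{task}_{shots}"

print(config_name("harness|truthfulqa:mc|0"))           # harness_truthfulqa_mc_0
print(config_name("harness|hendrycksTest-virology|5"))  # harness_hendrycksTest_virology_5
```

Passing the resulting name as the second argument to `load_dataset` above selects that task's per-example details.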
## Latest results
These are the [latest results from run 2023-12-04T17:53:31.960115](https://huggingface.co/datasets/open-llm-leaderboard/details_mrfakename__NeuralOrca-7B-v1/blob/main/results_2023-12-04T17-53-31.960115.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6385330990221446,
"acc_stderr": 0.032248165389573695,
"acc_norm": 0.6406523603337572,
"acc_norm_stderr": 0.032892154968215216,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.01689818070697389,
"mc2": 0.5457774305208005,
"mc2_stderr": 0.015413416681633433
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349814,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.01391303452962045
},
"harness|hellaswag|10": {
"acc": 0.6638119896434973,
"acc_stderr": 0.004714386376337136,
"acc_norm": 0.8507269468233419,
"acc_norm_stderr": 0.0035562912320503525
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973138,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247337,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.015788007190185884,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.015788007190185884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666789,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666789
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730583,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730583
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000318,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128438,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128438
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.01689818070697389,
"mc2": 0.5457774305208005,
"mc2_stderr": 0.015413416681633433
},
"harness|winogrande|5": {
"acc": 0.7876874506708761,
"acc_stderr": 0.011493384687249784
},
"harness|gsm8k|5": {
"acc": 0.5845337376800607,
"acc_stderr": 0.013574222625031811
}
}
```
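As a quick sketch of working with these numbers, the snippet below computes an unweighted macro-average of `acc_norm` over the `hendrycksTest` (MMLU) entries. Only three subtasks from the JSON above are excerpted for brevity, and the simple mean is an assumption that may differ from the leaderboard's exact aggregation:

```python
# Excerpt of the per-task results above (three MMLU subtasks plus one
# non-MMLU entry to show that it is filtered out).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.7105263157894737},
    "harness|truthfulqa:mc|0": {"mc2": 0.5457774305208005},
}

# Keep only the MMLU (hendrycksTest) subtasks and average their acc_norm.
mmlu_scores = [
    metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_macro_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_macro_avg, 4))  # → 0.5344
```

Dropping the excerpt and feeding in the full results dict gives the macro-average over all 57 MMLU subtasks.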
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_mrfakename__NeuralOrca-7B-v1 | [
"region:us"
] | 2023-12-04T17:56:24+00:00 | {"pretty_name": "Evaluation run of mrfakename/NeuralOrca-7B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [mrfakename/NeuralOrca-7B-v1](https://huggingface.co/mrfakename/NeuralOrca-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mrfakename__NeuralOrca-7B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T17:53:31.960115](https://huggingface.co/datasets/open-llm-leaderboard/details_mrfakename__NeuralOrca-7B-v1/blob/main/results_2023-12-04T17-53-31.960115.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6385330990221446,\n \"acc_stderr\": 0.032248165389573695,\n \"acc_norm\": 0.6406523603337572,\n \"acc_norm_stderr\": 0.032892154968215216,\n \"mc1\": 0.36964504283965727,\n \"mc1_stderr\": 0.01689818070697389,\n \"mc2\": 0.5457774305208005,\n \"mc2_stderr\": 0.015413416681633433\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349814,\n \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.01391303452962045\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6638119896434973,\n \"acc_stderr\": 0.004714386376337136,\n \"acc_norm\": 0.8507269468233419,\n \"acc_norm_stderr\": 0.0035562912320503525\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n 
\"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n 
\"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 
0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973138,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973138\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247337,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247337\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n \"acc_stderr\": 0.015788007190185884,\n \"acc_norm\": 0.33519553072625696,\n \"acc_norm_stderr\": 0.015788007190185884\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666789,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666789\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 
0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.012737361318730583,\n \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.012737361318730583\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000318,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000318\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128438,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128438\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 
0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n \"mc1_stderr\": 0.01689818070697389,\n \"mc2\": 0.5457774305208005,\n \"mc2_stderr\": 0.015413416681633433\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7876874506708761,\n \"acc_stderr\": 0.011493384687249784\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5845337376800607,\n \"acc_stderr\": 0.013574222625031811\n }\n}\n```", "repo_url": "https://huggingface.co/mrfakename/NeuralOrca-7B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-53-31.960115.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-53-31.960115.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-53-31.960115.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-53-31.960115.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-53-31.960115.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-53-31.960115.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-53-31.960115.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-53-31.960115.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["**/details_harness|winogrande|5_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T17-53-31.960115.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T17_53_31.960115", "path": ["results_2023-12-04T17-53-31.960115.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T17-53-31.960115.parquet"]}]}]} | 2023-12-04T17:57:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mrfakename/NeuralOrca-7B-v1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model mrfakename/NeuralOrca-7B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
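The loading snippet is missing from this card, so here is a minimal sketch. The repository id and the config naming scheme are assumptions inferred from the config names listed in this card (`harness_<task>_<n_shot>`) and the Leaderboard's usual `details_<org>__<model>` convention; adjust them to the actual repo name on the Hub.

```python
def details_ref(model: str, task: str, shots: int) -> tuple[str, str]:
    """Build the (repo_id, config_name) pair for an Open LLM Leaderboard
    details dataset. Both the repo prefix and the config pattern are
    assumptions based on the config names shown in this card."""
    repo = f"open-llm-leaderboard/details_{model.replace('/', '__')}"
    config = f"harness_{task}_{shots}"
    return repo, config

repo, config = details_ref("mrfakename/NeuralOrca-7B-v1", "winogrande", 5)
# With the `datasets` library installed (requires network access):
# from datasets import load_dataset
# data = load_dataset(repo, config, split="latest")
# Use split="2023_12_04T17_53_31.960115" to pin this specific run.
```

The `split="latest"` choice matches the data_files mapping in this card's metadata, where every config exposes both a timestamped split and a "latest" split.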
## Latest results
These are the latest results from run 2023-12-04T17:53:31.960115 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
50d7f2698d1105a0fe1e7630590f12a691b70a66 |
# Dataset Card for Evaluation run of Felladrin/TinyMistral-248M-SFT-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Felladrin/TinyMistral-248M-SFT-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Felladrin/TinyMistral-248M-SFT-v3](https://huggingface.co/Felladrin/TinyMistral-248M-SFT-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Felladrin__TinyMistral-248M-SFT-v3",
"harness_winogrande_5",
split="train")
```
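The configuration names follow a simple convention that can be inferred from the configs listed in this card: a harness task identifier such as `harness|arc:challenge|25` becomes `harness_arc_challenge_25` (the mapping below is an assumption based on the visible config names, e.g. `harness_truthfulqa_mc_0` and `harness_winogrande_5`, and may not hold for every task):

```python
# Sketch: derive this dataset's configuration name from a harness task
# identifier. The separator substitution is inferred from the config
# names listed in this card and is not an official API.

def task_to_config_name(task_id: str) -> str:
    """Replace the harness separators "|" and ":" with underscores."""
    return task_id.replace("|", "_").replace(":", "_")

print(task_to_config_name("harness|arc:challenge|25"))  # harness_arc_challenge_25
print(task_to_config_name("harness|truthfulqa:mc|0"))   # harness_truthfulqa_mc_0
print(task_to_config_name("harness|winogrande|5"))      # harness_winogrande_5
```

The resulting string can then be passed as the second argument to `load_dataset` as shown above.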
## Latest results
These are the [latest results from run 2023-12-04T18:03:12.401261](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__TinyMistral-248M-SFT-v3/blob/main/results_2023-12-04T18-03-12.401261.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23016202481388653,
"acc_stderr": 0.029832125302523167,
"acc_norm": 0.22987279360507185,
"acc_norm_stderr": 0.03061582219263556,
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460962,
"mc2": 0.400307198899101,
"mc2_stderr": 0.014941622020470767
},
"harness|arc:challenge|25": {
"acc": 0.19283276450511946,
"acc_stderr": 0.01152905546566333,
"acc_norm": 0.21928327645051193,
"acc_norm_stderr": 0.012091245787615721
},
"harness|hellaswag|10": {
"acc": 0.27106154152559253,
"acc_stderr": 0.004435993492583857,
"acc_norm": 0.2826130252937662,
"acc_norm_stderr": 0.004493495872000123
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108618,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108618
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198813,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198813
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.18783068783068782,
"acc_stderr": 0.0201157341415211,
"acc_norm": 0.18783068783068782,
"acc_norm_stderr": 0.0201157341415211
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818115,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818115
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.022037217340267836,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.022037217340267836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.16748768472906403,
"acc_stderr": 0.026273086047535418,
"acc_norm": 0.16748768472906403,
"acc_norm_stderr": 0.026273086047535418
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.02655220782821529,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.02655220782821529
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19170984455958548,
"acc_stderr": 0.028408953626245296,
"acc_norm": 0.19170984455958548,
"acc_norm_stderr": 0.028408953626245296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.021020672680827912,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.021020672680827912
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.025040443877000683,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.025040443877000683
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473836,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473836
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936104,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936104
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.029312814153955917,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.029312814153955917
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398682,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398682
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574918,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.022122439772480778,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.022122439772480778
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543332,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543332
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642973,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642973
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.010946570966348788,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.010946570966348788
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.023157468308559324,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.023157468308559324
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.024789071332007633,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.024789071332007633
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460962,
"mc2": 0.400307198899101,
"mc2_stderr": 0.014941622020470767
},
"harness|winogrande|5": {
"acc": 0.5153906866614049,
"acc_stderr": 0.014045826789783656
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
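The per-task entries above all share the same shape, so aggregate metrics such as an MMLU average can be recomputed by filtering on the `harness|hendrycksTest-` prefix. A minimal sketch, using a hardcoded excerpt of illustrative entries in place of the full results JSON:

```python
# Sketch: averaging per-subject accuracies from a results dictionary
# shaped like the one above. The entries below are a small excerpt for
# illustration; a real run would parse the full results JSON instead.

results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.21},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.18518518518518517},
    "harness|winogrande|5": {"acc": 0.5153906866614049},
}

# Keep only the MMLU (hendrycksTest) subjects.
mmlu_scores = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average over {len(mmlu_scores)} subjects: {mmlu_avg:.4f}")
```

The same pattern applies to `acc_norm` or any other per-task field.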
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Felladrin__TinyMistral-248M-SFT-v3 | [
"region:us"
] | 2023-12-04T18:06:04+00:00 | {"pretty_name": "Evaluation run of Felladrin/TinyMistral-248M-SFT-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [Felladrin/TinyMistral-248M-SFT-v3](https://huggingface.co/Felladrin/TinyMistral-248M-SFT-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Felladrin__TinyMistral-248M-SFT-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T18:03:12.401261](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__TinyMistral-248M-SFT-v3/blob/main/results_2023-12-04T18-03-12.401261.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23016202481388653,\n \"acc_stderr\": 0.029832125302523167,\n \"acc_norm\": 0.22987279360507185,\n \"acc_norm_stderr\": 0.03061582219263556,\n \"mc1\": 0.20563035495716034,\n \"mc1_stderr\": 0.014148482219460962,\n \"mc2\": 0.400307198899101,\n \"mc2_stderr\": 0.014941622020470767\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.19283276450511946,\n \"acc_stderr\": 0.01152905546566333,\n \"acc_norm\": 0.21928327645051193,\n \"acc_norm_stderr\": 0.012091245787615721\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27106154152559253,\n \"acc_stderr\": 0.004435993492583857,\n \"acc_norm\": 0.2826130252937662,\n \"acc_norm_stderr\": 0.004493495872000123\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108618,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108618\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.030631145539198813,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.030631145539198813\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.18783068783068782,\n \"acc_stderr\": 0.0201157341415211,\n \"acc_norm\": 0.18783068783068782,\n \"acc_norm_stderr\": 0.0201157341415211\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 
0.03718489006818115,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03718489006818115\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18387096774193548,\n \"acc_stderr\": 0.022037217340267836,\n \"acc_norm\": 0.18387096774193548,\n \"acc_norm_stderr\": 0.022037217340267836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.16748768472906403,\n \"acc_stderr\": 0.026273086047535418,\n \"acc_norm\": 0.16748768472906403,\n \"acc_norm_stderr\": 0.026273086047535418\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.02655220782821529,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.02655220782821529\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19170984455958548,\n \"acc_stderr\": 0.028408953626245296,\n \"acc_norm\": 0.19170984455958548,\n \"acc_norm_stderr\": 0.028408953626245296\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.021020672680827912,\n \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.021020672680827912\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.025040443877000683,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.025040443877000683\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473836,\n \"acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473836\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936104,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936104\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955917,\n \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955917\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 
0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n \"acc_stderr\": 0.015246803197398682,\n \"acc_norm\": 0.2388250319284802,\n \"acc_norm_stderr\": 0.015246803197398682\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545546,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545546\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574918,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574918\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888146\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n 
\"acc_stderr\": 0.022122439772480778,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.022122439772480778\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543332,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543332\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642973,\n \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642973\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n \"acc_stderr\": 0.010946570966348788,\n \"acc_norm\": 0.242503259452412,\n \"acc_norm_stderr\": 0.010946570966348788\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.023157468308559324,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.023157468308559324\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.024789071332007633,\n \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.024789071332007633\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 
0.26506024096385544,\n \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20563035495716034,\n \"mc1_stderr\": 0.014148482219460962,\n \"mc2\": 0.400307198899101,\n \"mc2_stderr\": 0.014941622020470767\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5153906866614049,\n \"acc_stderr\": 0.014045826789783656\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Felladrin/TinyMistral-248M-SFT-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-03-12.401261.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-03-12.401261.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-03-12.401261.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-03-12.401261.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-03-12.401261.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-03-12.401261.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-03-12.401261.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": 
"2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": 
"2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-03-12.401261.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["**/details_harness|winogrande|5_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T18-03-12.401261.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_03_12.401261", "path": ["results_2023-12-04T18-03-12.401261.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T18-03-12.401261.parquet"]}]}]} | 2023-12-04T18:06:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Felladrin/TinyMistral-248M-SFT-v3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Felladrin/TinyMistral-248M-SFT-v3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
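The code snippet for this step appears to have been stripped from this plain-text rendering. A minimal sketch, following the pattern used by the other cards in this file (the helper names `details_repo_id` and `load_details` are introduced here for illustration, and "harness_winogrande_5" is just one of the 63 configurations):

```python
def details_repo_id(model_id: str) -> str:
    # Open LLM Leaderboard detail repos are named
    # "open-llm-leaderboard/details_<org>__<model>" (the "/" becomes "__")
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")


def load_details(model_id: str, config: str = "harness_winogrande_5",
                 split: str = "train"):
    # Requires network access and `pip install datasets`
    from datasets import load_dataset
    return load_dataset(details_repo_id(model_id), config, split=split)


print(details_repo_id("Felladrin/TinyMistral-248M-SFT-v3"))
# → open-llm-leaderboard/details_Felladrin__TinyMistral-248M-SFT-v3
```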
## Latest results
These are the latest results from run 2023-12-04T18:03:12.401261 (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each of them in the "results" and "latest" splits for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Felladrin/TinyMistral-248M-SFT-v3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Felladrin/TinyMistral... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Felladrin/TinyMistral-248M-SFT-v3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mo... | [
6,
26,
31,
175,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Felladrin/TinyMistral-248M-SFT-v3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fella... |
2e278713b888d03cadbadeabd551273e0d63d78b |
# Dataset Card for Evaluation run of chargoddard/loyal-piano-m7-cdpo
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/loyal-piano-m7-cdpo
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/loyal-piano-m7-cdpo](https://huggingface.co/chargoddard/loyal-piano-m7-cdpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__loyal-piano-m7-cdpo",
"harness_winogrande_5",
split="train")
```
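Because each run is stored as a split named after its timestamp, the most recent run can also be picked programmatically by sorting the split names. A minimal, self-contained sketch (the names below mirror the two runs recorded in this dataset; for this fixed-width timestamp format, lexicographic order matches chronological order):

```python
# Split names follow this dataset's run-timestamp pattern;
# lexicographic max is the chronologically latest run.
splits = [
    "2023_12_04T18_06_46.796390",
    "2023_12_04T19_22_08.333315",
]

latest = max(splits)  # the split the "latest" alias points to
print(latest)
```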
## Latest results
These are the [latest results from run 2023-12-04T19:22:08.333315](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__loyal-piano-m7-cdpo/blob/main/results_2023-12-04T19-22-08.333315.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6467300740498503,
"acc_stderr": 0.0321481764536421,
"acc_norm": 0.6493966251003455,
"acc_norm_stderr": 0.03278779694207035,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6154444740630072,
"mc2_stderr": 0.015471878904856169
},
"harness|arc:challenge|25": {
"acc": 0.6416382252559727,
"acc_stderr": 0.014012883334859857,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635473
},
"harness|hellaswag|10": {
"acc": 0.6652061342362079,
"acc_stderr": 0.00470953886491632,
"acc_norm": 0.8542123083051185,
"acc_norm_stderr": 0.0035217202839105555
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.04115324610336953,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.04115324610336953
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432104,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432104
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364423,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364423
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342853,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342853
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066302,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.01651959427529712,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.01651959427529712
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45436766623207303,
"acc_stderr": 0.012716941720734808,
"acc_norm": 0.45436766623207303,
"acc_norm_stderr": 0.012716941720734808
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.01924978569171721,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.01924978569171721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6154444740630072,
"mc2_stderr": 0.015471878904856169
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881573
},
"harness|gsm8k|5": {
"acc": 0.5633055344958302,
"acc_stderr": 0.013661649780905493
}
}
```
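The "all" block at the top of these results aggregates the per-task scores. As a quick local sanity check, per-task accuracies can be averaged directly; the sketch below uses a few of the values reported above (note this is only an illustrative mean over a subset — the leaderboard's own aggregation covers all tasks and may group them differently):

```python
# A handful of per-task accuracies copied from the results above.
task_acc = {
    "harness|arc:challenge|25": 0.6416382252559727,
    "harness|hellaswag|10": 0.6652061342362079,
    "harness|hendrycksTest-anatomy|5": 0.6518518518518519,
    "harness|hendrycksTest-astronomy|5": 0.6842105263157895,
}

# Unweighted mean over the selected tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"mean accuracy over {len(task_acc)} tasks: {mean_acc:.4f}")
```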
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-22-08.333315.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-22-08.333315.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-22-08.333315.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-22-08.333315.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-22-08.333315.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": 
[{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", 
"path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-22-08.333315.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": 
"2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", 
"path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["**/details_harness|winogrande|5_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["**/details_harness|winogrande|5_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T19-22-08.333315.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_06_46.796390", "path": ["results_2023-12-04T18-06-46.796390.parquet"]}, {"split": "2023_12_04T19_22_08.333315", "path": ["results_2023-12-04T19-22-08.333315.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T19-22-08.333315.parquet"]}]}]} | 2023-12-04T19:25:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of chargoddard/loyal-piano-m7-cdpo
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model chargoddard/loyal-piano-m7-cdpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-04T19:22:08.333315 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of chargoddard/loyal-piano-m7-cdpo",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargoddard/loyal-piano... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chargoddard/loyal-piano-m7-cdpo",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mode... | [
6,
24,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chargoddard/loyal-piano-m7-cdpo## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargod... |
a1c0a9dce00b17a3139708bcca58516e2994f9e9 |
# Dataset Card for Evaluation run of qiyinmiss/My_GPT2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/qiyinmiss/My_GPT2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [qiyinmiss/My_GPT2](https://huggingface.co/qiyinmiss/My_GPT2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_qiyinmiss__My_GPT2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T18:10:51.654289](https://huggingface.co/datasets/open-llm-leaderboard/details_qiyinmiss__My_GPT2/blob/main/results_2023-12-04T18-10-51.654289.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2578870030806924,
"acc_stderr": 0.030639295152135662,
"acc_norm": 0.2586921012373598,
"acc_norm_stderr": 0.031410248419889694,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.4073203809998297,
"mc2_stderr": 0.014931113118872399
},
"harness|arc:challenge|25": {
"acc": 0.19880546075085323,
"acc_stderr": 0.011662850198175539,
"acc_norm": 0.21928327645051193,
"acc_norm_stderr": 0.012091245787615723
},
"harness|hellaswag|10": {
"acc": 0.29267078271260705,
"acc_stderr": 0.004540586983229993,
"acc_norm": 0.3158733320055766,
"acc_norm_stderr": 0.004639126951051421
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.0301675334686327,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.0301675334686327
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756194,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628834,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.0312984318574381,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.0312984318574381
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.3,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114475,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114475
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2794871794871795,
"acc_stderr": 0.022752388839776826,
"acc_norm": 0.2794871794871795,
"acc_norm_stderr": 0.022752388839776826
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952688,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952688
},
"harness|hendrycksTest-management|5": {
"acc": 0.34951456310679613,
"acc_stderr": 0.04721188506097173,
"acc_norm": 0.34951456310679613,
"acc_norm_stderr": 0.04721188506097173
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1752136752136752,
"acc_stderr": 0.02490443909891822,
"acc_norm": 0.1752136752136752,
"acc_norm_stderr": 0.02490443909891822
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21839080459770116,
"acc_stderr": 0.01477435831993449,
"acc_norm": 0.21839080459770116,
"acc_norm_stderr": 0.01477435831993449
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880592,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880592
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443737,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443737
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913222,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772426,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772426
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.030709824050565274,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.030709824050565274
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.4073203809998297,
"mc2_stderr": 0.014931113118872399
},
"harness|winogrande|5": {
"acc": 0.505130228887135,
"acc_stderr": 0.014051745961790513
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022544736
}
}
```
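The per-task entries above all share the same shape (scores such as `acc` and `acc_stderr` keyed by `harness|<task>|<n_shots>`), so aggregates like the MMLU average can be recomputed directly from the dictionary. A minimal sketch, where the `mmlu_average` helper and the `sample` dict are illustrative and not part of the leaderboard tooling:

```python
# Sketch: average the MMLU ("hendrycksTest") accuracies from a results
# dictionary shaped like the JSON above.

def mmlu_average(results: dict) -> float:
    """Mean 'acc' over all harness|hendrycksTest-* tasks."""
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Tiny excerpt of the structure shown above, for illustration only.
sample = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.23703703703703705},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.16447368421052633},
    "harness|truthfulqa:mc|0": {"mc1": 0.22766217870257038},
}
print(round(mmlu_average(sample), 4))  # → 0.2008
```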
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_qiyinmiss__My_GPT2 | [
"region:us"
] | 2023-12-04T18:12:28+00:00 | {"pretty_name": "Evaluation run of qiyinmiss/My_GPT2", "dataset_summary": "Dataset automatically created during the evaluation run of model [qiyinmiss/My_GPT2](https://huggingface.co/qiyinmiss/My_GPT2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_qiyinmiss__My_GPT2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T18:10:51.654289](https://huggingface.co/datasets/open-llm-leaderboard/details_qiyinmiss__My_GPT2/blob/main/results_2023-12-04T18-10-51.654289.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2578870030806924,\n \"acc_stderr\": 0.030639295152135662,\n \"acc_norm\": 0.2586921012373598,\n \"acc_norm_stderr\": 0.031410248419889694,\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4073203809998297,\n \"mc2_stderr\": 0.014931113118872399\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.19880546075085323,\n \"acc_stderr\": 0.011662850198175539,\n \"acc_norm\": 0.21928327645051193,\n \"acc_norm_stderr\": 0.012091245787615723\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29267078271260705,\n \"acc_stderr\": 0.004540586983229993,\n \"acc_norm\": 0.3158733320055766,\n \"acc_norm_stderr\": 0.004639126951051421\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.0301675334686327,\n \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.0301675334686327\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774711,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774711\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756194,\n \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756194\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628834,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628834\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 
0.0312984318574381,\n \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.0312984318574381\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114475,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114475\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2794871794871795,\n \"acc_stderr\": 0.022752388839776826,\n \"acc_norm\": 0.2794871794871795,\n \"acc_norm_stderr\": 0.022752388839776826\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136098,\n \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136098\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.3004484304932735,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.32231404958677684,\n \"acc_stderr\": 0.04266416363352168,\n \"acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.04266416363352168\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 
0.03957835471980981,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.04059867246952688,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.04059867246952688\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.04721188506097173,\n \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.04721188506097173\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1752136752136752,\n \"acc_stderr\": 0.02490443909891822,\n \"acc_norm\": 0.1752136752136752,\n \"acc_norm_stderr\": 0.02490443909891822\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21839080459770116,\n \"acc_stderr\": 0.01477435831993449,\n \"acc_norm\": 0.21839080459770116,\n \"acc_norm_stderr\": 0.01477435831993449\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.02289408248992599,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.02289408248992599\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888146\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n \"acc_stderr\": 
0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880592,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880592\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n \"acc_stderr\": 0.011025499291443737,\n \"acc_norm\": 0.24771838331160365,\n \"acc_norm_stderr\": 0.011025499291443737\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913222,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913222\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772426,\n \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772426\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n 
\"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4073203809998297,\n \"mc2_stderr\": 0.014931113118872399\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.014051745961790513\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.0022675371022544736\n }\n}\n```", "repo_url": "https://huggingface.co/qiyinmiss/My_GPT2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-10-51.654289.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-10-51.654289.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-10-51.654289.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-10-51.654289.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-10-51.654289.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-10-51.654289.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-10-51.654289.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": 
"2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": 
"2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-10-51.654289.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["**/details_harness|winogrande|5_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T18-10-51.654289.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_10_51.654289", "path": ["results_2023-12-04T18-10-51.654289.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T18-10-51.654289.parquet"]}]}]} | 2023-12-04T18:13:11+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of qiyinmiss/My_GPT2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model qiyinmiss/My_GPT2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-04T18:10:51.654289 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
923794aca621c16b1fd1a79148103ff36d050c7e |
# Dataset Card for Evaluation run of L-R/LLmRa-1.3B_V2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/L-R/LLmRa-1.3B_V2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [L-R/LLmRa-1.3B_V2](https://huggingface.co/L-R/LLmRa-1.3B_V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_L-R__LLmRa-1.3B_V2",
"harness_winogrande_5",
split="train")
```
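The other per-task configs follow the same naming pattern: `harness_` plus the task name with `|`, `:` and `-` replaced by underscores, plus the few-shot count (e.g. `harness_hendrycksTest_anatomy_5`, `harness_arc_challenge_25`). A small helper (`details_config` is a hypothetical name, shown here only to illustrate the convention) for building these config names:

```python
def details_config(task: str, n_shots: int) -> str:
    """Build the details config name for a harness task, e.g.
    ("hendrycksTest-anatomy", 5) -> "harness_hendrycksTest_anatomy_5"."""
    cleaned = task.replace("|", "_").replace(":", "_").replace("-", "_")
    return f"harness_{cleaned}_{n_shots}"

print(details_config("arc:challenge", 25))        # harness_arc_challenge_25
print(details_config("hendrycksTest-anatomy", 5)) # harness_hendrycksTest_anatomy_5
```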
## Latest results
These are the [latest results from run 2023-12-04T18:13:15.714207](https://huggingface.co/datasets/open-llm-leaderboard/details_L-R__LLmRa-1.3B_V2/blob/main/results_2023-12-04T18-13-15.714207.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2645874461578066,
"acc_stderr": 0.03098158394706952,
"acc_norm": 0.26593270237799466,
"acc_norm_stderr": 0.031804165627672396,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.3646013754071996,
"mc2_stderr": 0.014251642555151921
},
"harness|arc:challenge|25": {
"acc": 0.2815699658703072,
"acc_stderr": 0.013143376735009009,
"acc_norm": 0.3046075085324232,
"acc_norm_stderr": 0.01344952210993249
},
"harness|hellaswag|10": {
"acc": 0.41037641904003186,
"acc_stderr": 0.004908967278222486,
"acc_norm": 0.5302728540131448,
"acc_norm_stderr": 0.004980627287147582
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493854,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493854
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062949,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062949
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322716,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322716
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.17446808510638298,
"acc_stderr": 0.024809442335503976,
"acc_norm": 0.17446808510638298,
"acc_norm_stderr": 0.024809442335503976
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27419354838709675,
"acc_stderr": 0.025378139970885193,
"acc_norm": 0.27419354838709675,
"acc_norm_stderr": 0.025378139970885193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.032257994762334846,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.032257994762334846
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548298,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548298
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.03274287914026867,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.03274287914026867
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.034356961683613546,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.034356961683613546
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.02407869658063547,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.02407869658063547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2773109243697479,
"acc_stderr": 0.02907937453948001,
"acc_norm": 0.2773109243697479,
"acc_norm_stderr": 0.02907937453948001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30825688073394497,
"acc_stderr": 0.019798366698367275,
"acc_norm": 0.30825688073394497,
"acc_norm_stderr": 0.019798366698367275
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674096,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674096
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.0273034845990694,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.0273034845990694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2062780269058296,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.2062780269058296,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596917,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596917
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.03952301967702511,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.03952301967702511
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3034188034188034,
"acc_stderr": 0.030118210106942662,
"acc_norm": 0.3034188034188034,
"acc_norm_stderr": 0.030118210106942662
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2247765006385696,
"acc_stderr": 0.014927447101937169,
"acc_norm": 0.2247765006385696,
"acc_norm_stderr": 0.014927447101937169
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.01421957078810399,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.01421957078810399
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2540192926045016,
"acc_stderr": 0.024723861504771696,
"acc_norm": 0.2540192926045016,
"acc_norm_stderr": 0.024723861504771696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.02635806569888059,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.02635806569888059
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23663624511082137,
"acc_stderr": 0.010855137351572732,
"acc_norm": 0.23663624511082137,
"acc_norm_stderr": 0.010855137351572732
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528047,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528047
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.027833023871399697,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.027833023871399697
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.031524391865554044,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.031524391865554044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.03175554786629919,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.03175554786629919
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.3646013754071996,
"mc2_stderr": 0.014251642555151921
},
"harness|winogrande|5": {
"acc": 0.5927387529597474,
"acc_stderr": 0.013808654122417847
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
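Entries like the ones above can be post-processed directly in Python. A minimal sketch (the accuracy values are copied verbatim from the results block; the subset of tasks is illustrative):

```python
# A few 5-shot MMLU subtask accuracies from the results above.
scores = {
    "hendrycksTest-professional_medicine": 0.41544117647058826,
    "hendrycksTest-management": 0.3786407766990291,
    "hendrycksTest-us_foreign_policy": 0.19,
    "hendrycksTest-conceptual_physics": 0.17446808510638298,
}

# Find the strongest and weakest subtasks in this subset.
best = max(scores, key=scores.get)
worst = min(scores, key=scores.get)
print(f"best:  {best} ({scores[best]:.3f})")
print(f"worst: {worst} ({scores[worst]:.3f})")
```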
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_L-R__LLmRa-1.3B_V2 | [
"region:us"
] | 2023-12-04T18:15:22+00:00 | {"pretty_name": "Evaluation run of L-R/LLmRa-1.3B_V2", "dataset_summary": "Dataset automatically created during the evaluation run of model [L-R/LLmRa-1.3B_V2](https://huggingface.co/L-R/LLmRa-1.3B_V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_L-R__LLmRa-1.3B_V2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T18:13:15.714207](https://huggingface.co/datasets/open-llm-leaderboard/details_L-R__LLmRa-1.3B_V2/blob/main/results_2023-12-04T18-13-15.714207.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2645874461578066,\n \"acc_stderr\": 0.03098158394706952,\n \"acc_norm\": 0.26593270237799466,\n \"acc_norm_stderr\": 0.031804165627672396,\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.3646013754071996,\n \"mc2_stderr\": 0.014251642555151921\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2815699658703072,\n \"acc_stderr\": 0.013143376735009009,\n \"acc_norm\": 0.3046075085324232,\n \"acc_norm_stderr\": 0.01344952210993249\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41037641904003186,\n \"acc_stderr\": 0.004908967278222486,\n \"acc_norm\": 0.5302728540131448,\n \"acc_norm_stderr\": 0.004980627287147582\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493854,\n \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493854\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062949,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062949\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322716,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322716\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.17446808510638298,\n \"acc_stderr\": 0.024809442335503976,\n \"acc_norm\": 0.17446808510638298,\n \"acc_norm_stderr\": 0.024809442335503976\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 
0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27419354838709675,\n \"acc_stderr\": 0.025378139970885193,\n \"acc_norm\": 0.27419354838709675,\n \"acc_norm_stderr\": 0.025378139970885193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.032257994762334846,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.032257994762334846\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548298,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548298\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.03274287914026867,\n \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.03274287914026867\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.034356961683613546,\n \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.034356961683613546\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.02407869658063547,\n \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.02407869658063547\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2773109243697479,\n \"acc_stderr\": 0.02907937453948001,\n \"acc_norm\": 0.2773109243697479,\n \"acc_norm_stderr\": 0.02907937453948001\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.30825688073394497,\n \"acc_stderr\": 0.019798366698367275,\n \"acc_norm\": 0.30825688073394497,\n \"acc_norm_stderr\": 0.019798366698367275\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674096,\n \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674096\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.22784810126582278,\n \"acc_stderr\": 0.0273034845990694,\n \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.0273034845990694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2062780269058296,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.2062780269058296,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596917,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596917\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591207,\n \"acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591207\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 
0.21296296296296297,\n \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n \"acc_stderr\": 0.03952301967702511,\n \"acc_norm\": 0.22321428571428573,\n \"acc_norm_stderr\": 0.03952301967702511\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3034188034188034,\n \"acc_stderr\": 0.030118210106942662,\n \"acc_norm\": 0.3034188034188034,\n \"acc_norm_stderr\": 0.030118210106942662\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2247765006385696,\n \"acc_stderr\": 0.014927447101937169,\n \"acc_norm\": 0.2247765006385696,\n \"acc_norm_stderr\": 0.014927447101937169\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757177,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757177\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n \"acc_stderr\": 0.01421957078810399,\n \"acc_norm\": 0.23687150837988827,\n \"acc_norm_stderr\": 0.01421957078810399\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2540192926045016,\n \"acc_stderr\": 0.024723861504771696,\n \"acc_norm\": 
0.2540192926045016,\n \"acc_norm_stderr\": 0.024723861504771696\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.02635806569888059\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23663624511082137,\n \"acc_stderr\": 0.010855137351572732,\n \"acc_norm\": 0.23663624511082137,\n \"acc_norm_stderr\": 0.010855137351572732\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877746,\n \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528047,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528047\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399697,\n \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399697\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n \"acc_stderr\": 0.031524391865554044,\n \"acc_norm\": 0.2736318407960199,\n \"acc_norm_stderr\": 0.031524391865554044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n 
\"acc_stderr\": 0.03175554786629919,\n \"acc_norm\": 0.21084337349397592,\n \"acc_norm_stderr\": 0.03175554786629919\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.3646013754071996,\n \"mc2_stderr\": 0.014251642555151921\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5927387529597474,\n \"acc_stderr\": 0.013808654122417847\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/L-R/LLmRa-1.3B_V2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-13-15.714207.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-13-15.714207.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-13-15.714207.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-13-15.714207.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-13-15.714207.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-13-15.714207.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-13-15.714207.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-13-15.714207.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["**/details_harness|winogrande|5_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T18-13-15.714207.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_13_15.714207", "path": ["results_2023-12-04T18-13-15.714207.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T18-13-15.714207.parquet"]}]}]} | 2023-12-04T18:16:07+00:00 | [] | [] | TAGS
# Dataset Card for Evaluation run of L-R/LLmRa-1.3B_V2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model L-R/LLmRa-1.3B_V2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
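The loading snippet was stripped from this copy of the card; the sketch below reconstructs it from the convention used by the sibling cards in this dump. The repository slug is an assumption derived from the model name (slashes become `__`); the actual `load_dataset` call is shown commented out because it requires network access and the `datasets` package.

```python
# Slug assumed from the model name, following the pattern
# "open-llm-leaderboard/details_<org>__<model>" used by sibling cards.
org_model = "L-R/LLmRa-1.3B_V2"
repo = "open-llm-leaderboard/details_" + org_model.replace("/", "__")
print(repo)  # → open-llm-leaderboard/details_L-R__LLmRa-1.3B_V2

# With the `datasets` package installed and network access:
# from datasets import load_dataset
# data = load_dataset(repo, "harness_winogrande_5", split="train")
```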
## Latest results
These are the latest results from run 2023-12-04T18:13:15.714207 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval).
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
# Dataset Card for Evaluation run of Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B](https://huggingface.co/Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__neural-chat-7b-v3-1-OpenHermes-2.5-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T18:24:21.614365](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__neural-chat-7b-v3-1-OpenHermes-2.5-7B/blob/main/results_2023-12-04T18-24-21.614365.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6331720228661488,
"acc_stderr": 0.03245949446776424,
"acc_norm": 0.636248593036376,
"acc_norm_stderr": 0.033106781744445986,
"mc1": 0.4565483476132191,
"mc1_stderr": 0.01743728095318369,
"mc2": 0.612310037106096,
"mc2_stderr": 0.015369020754133529
},
"harness|arc:challenge|25": {
"acc": 0.6450511945392492,
"acc_stderr": 0.01398303690409409,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6573391754630552,
"acc_stderr": 0.004736292355716402,
"acc_norm": 0.8408683529177454,
"acc_norm_stderr": 0.003650512158306273
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.02446861524147892,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.02446861524147892
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.01591955782997604,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.01591955782997604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064077,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064077
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.038448761397852714,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.038448761397852714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281386,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281386
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381392,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381392
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4498044328552803,
"acc_stderr": 0.012705721498565104,
"acc_norm": 0.4498044328552803,
"acc_norm_stderr": 0.012705721498565104
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.019333142020797164,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.019333142020797164
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233254,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233254
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4565483476132191,
"mc1_stderr": 0.01743728095318369,
"mc2": 0.612310037106096,
"mc2_stderr": 0.015369020754133529
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774104
},
"harness|gsm8k|5": {
"acc": 0.5087187263078089,
"acc_stderr": 0.01377039069700212
}
}
```
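The per-task entries above are plain JSON, so they can be summarized locally without reloading the dataset. A minimal sketch using only the standard library; the excerpt copies a few values verbatim from the results above, and the "headline metric per task" choice (`acc_norm` when present, else `acc`) is an illustration, not the leaderboard's official aggregation:

```python
import json

# Small excerpt of the results JSON shown above (values copied verbatim).
results = json.loads("""
{
  "harness|arc:challenge|25": {"acc_norm": 0.6612627986348123},
  "harness|hellaswag|10": {"acc_norm": 0.8408683529177454},
  "harness|winogrande|5": {"acc": 0.7758484609313339},
  "harness|gsm8k|5": {"acc": 0.5087187263078089}
}
""")

# One headline metric per task: acc_norm when present, otherwise acc.
scores = {task: m.get("acc_norm", m.get("acc")) for task, m in results.items()}
average = sum(scores.values()) / len(scores)
print(round(average, 4))  # → 0.6967
```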
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__neural-chat-7b-v3-1-OpenHermes-2.5-7B | [
"region:us"
] | 2023-12-04T18:27:12+00:00 | {"pretty_name": "Evaluation run of Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B](https://huggingface.co/Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__neural-chat-7b-v3-1-OpenHermes-2.5-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T18:24:21.614365](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__neural-chat-7b-v3-1-OpenHermes-2.5-7B/blob/main/results_2023-12-04T18-24-21.614365.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6331720228661488,\n \"acc_stderr\": 0.03245949446776424,\n \"acc_norm\": 0.636248593036376,\n \"acc_norm_stderr\": 0.033106781744445986,\n \"mc1\": 0.4565483476132191,\n \"mc1_stderr\": 0.01743728095318369,\n \"mc2\": 0.612310037106096,\n \"mc2_stderr\": 0.015369020754133529\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6450511945392492,\n \"acc_stderr\": 0.01398303690409409,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6573391754630552,\n \"acc_stderr\": 0.004736292355716402,\n \"acc_norm\": 0.8408683529177454,\n \"acc_norm_stderr\": 0.003650512158306273\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.032555253593403555,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.032555253593403555\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n 
\"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.02446861524147892,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.02446861524147892\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 
0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064077,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064077\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.038448761397852714,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.038448761397852714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 
0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281386,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281386\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n \"acc_stderr\": 0.014036945850381392,\n \"acc_norm\": 0.80970625798212,\n \"acc_norm_stderr\": 0.014036945850381392\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n \"acc_stderr\": 0.016286674879101022,\n \"acc_norm\": 0.3865921787709497,\n \"acc_norm_stderr\": 0.016286674879101022\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n 
\"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4498044328552803,\n \"acc_stderr\": 0.012705721498565104,\n \"acc_norm\": 0.4498044328552803,\n \"acc_norm_stderr\": 0.012705721498565104\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.019333142020797164,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.019333142020797164\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233254,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233254\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 
0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4565483476132191,\n \"mc1_stderr\": 0.01743728095318369,\n \"mc2\": 0.612310037106096,\n \"mc2_stderr\": 0.015369020754133529\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774104\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5087187263078089,\n \"acc_stderr\": 0.01377039069700212\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-24-21.614365.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-24-21.614365.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-24-21.614365.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-24-21.614365.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-24-21.614365.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-24-21.614365.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-24-21.614365.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-24-21.614365.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["**/details_harness|winogrande|5_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T18-24-21.614365.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_24_21.614365", "path": ["results_2023-12-04T18-24-21.614365.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T18-24-21.614365.parquet"]}]}]} | 2023-12-04T18:28:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
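The code block that normally follows this sentence was stripped in this dump. As a hedged sketch: the per-task config names in this card are derived mechanically from the harness task ids, and the dataset id used below is an assumption based on the leaderboard's `details_<org>__<model>` naming convention, not something stated in this card.

```python
# Minimal sketch, not taken verbatim from this card.
# Maps a harness task id to the config names listed in this card's metadata.

def config_name(task: str, n_shot: int) -> str:
    """E.g. "hendrycksTest-virology" at 5-shot -> "harness_hendrycksTest_virology_5",
    "truthfulqa:mc" at 0-shot -> "harness_truthfulqa_mc_0"."""
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{n_shot}"

print(config_name("hendrycksTest-virology", 5))  # harness_hendrycksTest_virology_5

# Fetching the actual details requires network access; the dataset id here is an
# assumed example following the leaderboard convention:
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_Weyaxi__neural-chat-7b-v3-1-OpenHermes-2.5-7B",
#     config_name("hendrycksTest-virology", 5),
#     split="latest",
# )
```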
## Latest results
These are the latest results from run 2023-12-04T18:24:21.614365 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/neu... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluatio... | [
6,
30,
31,
179,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/neural-chat-7b-v3-1-OpenHermes-2.5-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of ... |
71334aa1d5ab17dac5b9e24866dc81019a49ac4a |
# Dataset Card for Evaluation run of simonveitner/MathHermes-2.5-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/simonveitner/MathHermes-2.5-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [simonveitner/MathHermes-2.5-Mistral-7B](https://huggingface.co/simonveitner/MathHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T18:25:11.977949](https://huggingface.co/datasets/open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B/blob/main/results_2023-12-04T18-25-11.977949.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6357052985064087,
"acc_stderr": 0.03227227710982547,
"acc_norm": 0.6396287253937496,
"acc_norm_stderr": 0.032910368232277956,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.519509607840464,
"mc2_stderr": 0.015313445088017108
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.01426963463567073,
"acc_norm": 0.6476109215017065,
"acc_norm_stderr": 0.013960142600598677
},
"harness|hellaswag|10": {
"acc": 0.652459669388568,
"acc_stderr": 0.004752158936871871,
"acc_norm": 0.8418641704839673,
"acc_norm_stderr": 0.0036412262941678012
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.0286372356398009,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.0286372356398009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229862,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229862
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993459,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993459
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2994413407821229,
"acc_stderr": 0.01531825774597671,
"acc_norm": 0.2994413407821229,
"acc_norm_stderr": 0.01531825774597671
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687492,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687492
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.519509607840464,
"mc2_stderr": 0.015313445088017108
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205191
},
"harness|gsm8k|5": {
"acc": 0.4927975739196361,
"acc_stderr": 0.013771055751972868
}
}
```
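Since the per-task scores above are plain JSON, a quick unweighted average over a handful of the MMLU subtasks can be computed directly. This is only an illustrative sketch using `acc_norm` values copied from the results block above, not an official leaderboard aggregation:

```python
# Illustrative only: a few acc_norm values copied from the results above.
subtask_acc_norm = {
    "abstract_algebra": 0.33,
    "anatomy": 0.5777777777777777,
    "astronomy": 0.6973684210526315,
    "business_ethics": 0.58,
}

# Unweighted (macro) average across the selected subtasks.
macro_avg = sum(subtask_acc_norm.values()) / len(subtask_acc_norm)
print(round(macro_avg, 4))
```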
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B | [
"region:us"
] | 2023-12-04T18:28:03+00:00 | {"pretty_name": "Evaluation run of simonveitner/MathHermes-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [simonveitner/MathHermes-2.5-Mistral-7B](https://huggingface.co/simonveitner/MathHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T18:25:11.977949](https://huggingface.co/datasets/open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B/blob/main/results_2023-12-04T18-25-11.977949.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6357052985064087,\n \"acc_stderr\": 0.03227227710982547,\n \"acc_norm\": 0.6396287253937496,\n \"acc_norm_stderr\": 0.032910368232277956,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.519509607840464,\n \"mc2_stderr\": 0.015313445088017108\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.01426963463567073,\n \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.013960142600598677\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.652459669388568,\n \"acc_stderr\": 0.004752158936871871,\n \"acc_norm\": 0.8418641704839673,\n \"acc_norm_stderr\": 0.0036412262941678012\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.0286372356398009,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.0286372356398009\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.032555253593403555,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.032555253593403555\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 
0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n 
\"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993459,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993459\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n \"acc_stderr\": 0.01531825774597671,\n \"acc_norm\": 0.2994413407821229,\n \"acc_norm_stderr\": 0.01531825774597671\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n 
\"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n 
\"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.519509607840464,\n \"mc2_stderr\": 0.015313445088017108\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205191\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4927975739196361,\n \"acc_stderr\": 0.013771055751972868\n }\n}\n```", "repo_url": "https://huggingface.co/simonveitner/MathHermes-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-25-11.977949.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-25-11.977949.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-25-11.977949.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-25-11.977949.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-25-11.977949.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-25-11.977949.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-25-11.977949.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": 
"2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": 
"2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-25-11.977949.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["**/details_harness|winogrande|5_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T18-25-11.977949.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_25_11.977949", "path": ["results_2023-12-04T18-25-11.977949.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T18-25-11.977949.parquet"]}]}]} | 2023-12-04T18:28:50+00:00 | [] | [] | TAGS
#region-us
# Dataset Card for Evaluation run of simonveitner/MathHermes-2.5-Mistral-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model simonveitner/MathHermes-2.5-Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
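The code snippet itself was stripped from this extract; below is a minimal sketch. The details repository name follows the leaderboard's `details_<org>__<model>` naming convention (an assumption here), and the actual download, which needs network access, is left commented out:

```python
# Assumed naming convention for Open LLM Leaderboard detail repositories:
# "open-llm-leaderboard/details_<org>__<model>"
org, model = "simonveitner", "MathHermes-2.5-Mistral-7B"
repo = f"open-llm-leaderboard/details_{org}__{model}"
print(repo)  # open-llm-leaderboard/details_simonveitner__MathHermes-2.5-Mistral-7B

# With the `datasets` library installed, one task's per-sample details
# could then be fetched with (network access required):
#   from datasets import load_dataset
#   data = load_dataset(repo, "harness_winogrande_5", split="train")
```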
## Latest results
These are the latest results from run 2023-12-04T18:25:11.977949 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
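For binary-scored tasks, the reported `acc_stderr` appears to be the sample standard error of the accuracy, sqrt(p(1-p)/(n-1)). As a sanity check, the sketch below reproduces the WinoGrande figure from the results metadata above (acc 0.77663772691397, acc_stderr 0.011705697565205191); the split size of 1,267 is an assumption based on the standard WinoGrande validation set:

```python
import math

# WinoGrande numbers reported in this repo's results metadata.
acc = 0.77663772691397
n = 1267  # assumed size of the WinoGrande validation split

# Sample standard error of a Bernoulli mean (ddof=1):
stderr = math.sqrt(acc * (1 - acc) / (n - 1))
print(round(stderr, 9))  # ≈ 0.011705698, matching the reported acc_stderr
```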
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
# Dataset Card for Evaluation run of vibhorag101/llama-2-13b-chat-hf-phr_mental_therapy
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/vibhorag101/llama-2-13b-chat-hf-phr_mental_therapy
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [vibhorag101/llama-2-13b-chat-hf-phr_mental_therapy](https://huggingface.co/vibhorag101/llama-2-13b-chat-hf-phr_mental_therapy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vibhorag101__llama-2-13b-chat-hf-phr_mental_therapy",
"harness_winogrande_5",
split="train")
```
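The run splits are named after the run timestamp (e.g. `2023_12_04T18_26_43.065214`), with `_` used where an ISO timestamp would use `-` and `:`. A minimal sketch of sorting such names to pick the most recent run; the split names below are hypothetical examples, not guaranteed to exist in this repo:

```python
from datetime import datetime

# Hypothetical split names following the timestamp naming scheme.
splits = ["2023_11_20T09_15_01.000000", "2023_12_04T18_26_43.065214"]

def parse_split(name: str) -> datetime:
    # The scheme replaces "-" and ":" of an ISO timestamp with "_".
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

latest = max(splits, key=parse_split)
print(latest)  # → 2023_12_04T18_26_43.065214
```

In practice the "latest" split already points at the newest run, so this is only needed when comparing several timestamped splits directly.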
## Latest results
These are the [latest results from run 2023-12-04T18:26:43.065214](https://huggingface.co/datasets/open-llm-leaderboard/details_vibhorag101__llama-2-13b-chat-hf-phr_mental_therapy/blob/main/results_2023-12-04T18-26-43.065214.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results file and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2434367192235923,
"acc_stderr": 0.03008501938303984,
"acc_norm": 0.24224538912156782,
"acc_norm_stderr": 0.030747150403453674,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024643,
"mc2": 0.4692403294958895,
"mc2_stderr": 0.015061938982346217
},
"harness|arc:challenge|25": {
"acc": 0.36945392491467577,
"acc_stderr": 0.014104578366491894,
"acc_norm": 0.38822525597269625,
"acc_norm_stderr": 0.01424161420741405
},
"harness|hellaswag|10": {
"acc": 0.5696076478789086,
"acc_stderr": 0.004941191607317913,
"acc_norm": 0.7276438956383191,
"acc_norm_stderr": 0.004442623590846322
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024643,
"mc2": 0.4692403294958895,
"mc2_stderr": 0.015061938982346217
},
"harness|winogrande|5": {
"acc": 0.6558800315706393,
"acc_stderr": 0.013352121905005941
},
"harness|gsm8k|5": {
"acc": 0.07808946171341925,
"acc_stderr": 0.007390654481108261
}
}
```
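As a quick sanity check, the per-task numbers above can be aggregated by hand. A minimal sketch that averages the `acc` values of a few of the MMLU (`hendrycksTest`) subtasks shown above; the values are hard-coded from the JSON for illustration, whereas real use would load the full results file:

```python
# A small subset of the results dict above, copied verbatim.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.18518518518518517},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
}

# Select the MMLU subtasks by their "hendrycksTest" prefix and average "acc".
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"mean acc over {len(mmlu_accs)} subtasks: {mean_acc:.4f}")
# → mean acc over 3 subtasks: 0.1943
```

The leaderboard's own MMLU aggregate averages all 57 subtasks the same way, so extending the dict to every `hendrycksTest` entry reproduces that number.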
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_vibhorag101__llama-2-13b-chat-hf-phr_mental_therapy | [
"region:us"
] | 2023-12-04T18:29:37+00:00 | {"pretty_name": "Evaluation run of vibhorag101/llama-2-13b-chat-hf-phr_mental_therapy", "dataset_summary": "Dataset automatically created during the evaluation run of model [vibhorag101/llama-2-13b-chat-hf-phr_mental_therapy](https://huggingface.co/vibhorag101/llama-2-13b-chat-hf-phr_mental_therapy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vibhorag101__llama-2-13b-chat-hf-phr_mental_therapy\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T18:26:43.065214](https://huggingface.co/datasets/open-llm-leaderboard/details_vibhorag101__llama-2-13b-chat-hf-phr_mental_therapy/blob/main/results_2023-12-04T18-26-43.065214.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2434367192235923,\n \"acc_stderr\": 0.03008501938303984,\n \"acc_norm\": 0.24224538912156782,\n \"acc_norm_stderr\": 0.030747150403453674,\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024643,\n \"mc2\": 0.4692403294958895,\n \"mc2_stderr\": 0.015061938982346217\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.36945392491467577,\n \"acc_stderr\": 0.014104578366491894,\n \"acc_norm\": 0.38822525597269625,\n \"acc_norm_stderr\": 0.01424161420741405\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5696076478789086,\n \"acc_stderr\": 0.004941191607317913,\n \"acc_norm\": 0.7276438956383191,\n \"acc_norm_stderr\": 0.004442623590846322\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 
0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 
0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 
0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n 
\"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024643,\n \"mc2\": 0.4692403294958895,\n \"mc2_stderr\": 0.015061938982346217\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6558800315706393,\n \"acc_stderr\": 0.013352121905005941\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \"acc_stderr\": 0.007390654481108261\n }\n}\n```", "repo_url": "https://huggingface.co/vibhorag101/llama-2-13b-chat-hf-phr_mental_therapy", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-26-43.065214.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-26-43.065214.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-26-43.065214.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-26-43.065214.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-26-43.065214.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-26-43.065214.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-26-43.065214.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": 
"2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": 
"2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-26-43.065214.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["**/details_harness|winogrande|5_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T18-26-43.065214.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_26_43.065214", "path": ["results_2023-12-04T18-26-43.065214.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T18-26-43.065214.parquet"]}]}]} | 2023-12-04T18:30:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vibhorag101/llama-2-13b-chat-hf-phr_mental_therapy
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model vibhorag101/llama-2-13b-chat-hf-phr_mental_therapy on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-04T18:26:43.065214 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
9551c5efdbbf44fd2ce614eee97f2017e18a7619 |
# Dataset Card for Evaluation run of SUSTech/SUS-Chat-34B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/SUSTech/SUS-Chat-34B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [SUSTech/SUS-Chat-34B](https://huggingface.co/SUSTech/SUS-Chat-34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SUSTech__SUS-Chat-34B",
"harness_winogrande_5",
split="train")
```
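Each per-task entry in the "Latest results" section below reports `acc`/`acc_norm` pairs, and the top-level "all" block is an average across the evaluated tasks. A minimal offline sketch of that aggregation, using two entries copied from this card's results (the leaderboard's own "all" averages over every task, not just these two):

```python
# Illustrative excerpt mirroring the per-task structure of the results JSON below.
results = {
    "harness|arc:challenge|25": {"acc": 0.6356655290102389, "acc_norm": 0.6629692832764505},
    "harness|hellaswag|10": {"acc": 0.6400119498107947, "acc_norm": 0.8390758812985462},
}

# Macro-average across tasks, the same kind of aggregation reported under "all".
avg_acc = sum(task["acc"] for task in results.values()) / len(results)
avg_acc_norm = sum(task["acc_norm"] for task in results.values()) / len(results)
print(f"acc={avg_acc:.4f} acc_norm={avg_acc_norm:.4f}")
```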
## Latest results
These are the [latest results from run 2023-12-10T10:55:45.455909](https://huggingface.co/datasets/open-llm-leaderboard/details_SUSTech__SUS-Chat-34B/blob/main/results_2023-12-10T10-55-45.455909.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7604055831741483,
"acc_stderr": 0.02829305624294987,
"acc_norm": 0.7636323040929514,
"acc_norm_stderr": 0.028842862208073416,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093897,
"mc2": 0.5704122295242341,
"mc2_stderr": 0.014843409183922712
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882419,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902274
},
"harness|hellaswag|10": {
"acc": 0.6400119498107947,
"acc_stderr": 0.004790155370993446,
"acc_norm": 0.8390758812985462,
"acc_norm_stderr": 0.003667099594023359
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.725925925925926,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.725925925925926,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.025648341251693605,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.025648341251693605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.02461829819586651,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02461829819586651
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.023112508176051236,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.023112508176051236
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7872340425531915,
"acc_stderr": 0.026754391348039783,
"acc_norm": 0.7872340425531915,
"acc_norm_stderr": 0.026754391348039783
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7116402116402116,
"acc_stderr": 0.023330654054535892,
"acc_norm": 0.7116402116402116,
"acc_norm_stderr": 0.023330654054535892
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6031746031746031,
"acc_stderr": 0.043758884927270585,
"acc_norm": 0.6031746031746031,
"acc_norm_stderr": 0.043758884927270585
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8935483870967742,
"acc_stderr": 0.01754510295165663,
"acc_norm": 0.8935483870967742,
"acc_norm_stderr": 0.01754510295165663
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706467,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706467
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.019880165406588803,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.019880165406588803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.030213340289237927,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.030213340289237927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8739495798319328,
"acc_stderr": 0.021559623121213928,
"acc_norm": 0.8739495798319328,
"acc_norm_stderr": 0.021559623121213928
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9247706422018349,
"acc_stderr": 0.011308662537571729,
"acc_norm": 0.9247706422018349,
"acc_norm_stderr": 0.011308662537571729
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.01809424711647332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.01809424711647332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035206,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665168,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665168
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8895705521472392,
"acc_stderr": 0.024624937788941318,
"acc_norm": 0.8895705521472392,
"acc_norm_stderr": 0.024624937788941318
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.029315962918813474,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.029315962918813474
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813235,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813235
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.010461015338193068,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.010461015338193068
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6826815642458101,
"acc_stderr": 0.01556639263005703,
"acc_norm": 0.6826815642458101,
"acc_norm_stderr": 0.01556639263005703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02082375883758091,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02082375883758091
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8167202572347267,
"acc_stderr": 0.021974198848265823,
"acc_norm": 0.8167202572347267,
"acc_norm_stderr": 0.021974198848265823
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.01924252622654454,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.01924252622654454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6418439716312057,
"acc_stderr": 0.02860208586275942,
"acc_norm": 0.6418439716312057,
"acc_norm_stderr": 0.02860208586275942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6160365058670143,
"acc_stderr": 0.01242158783313423,
"acc_norm": 0.6160365058670143,
"acc_norm_stderr": 0.01242158783313423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559352,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273337,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273337
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.043502714429232425,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.043502714429232425
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.023897144768914524,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.023897144768914524
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789256,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789256
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093897,
"mc2": 0.5704122295242341,
"mc2_stderr": 0.014843409183922712
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237424
},
"harness|gsm8k|5": {
"acc": 0.7217589082638363,
"acc_stderr": 0.012343803671422683
}
}
```
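Once loaded, the per-task dictionary above can be sliced programmatically — for example, averaging the MMLU (`hendrycksTest`) subtask accuracies (a minimal sketch; the `results` literal below copies just a few entries from the JSON above for illustration):

```python
# A few per-task entries copied from the results JSON above.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.725925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8881578947368421},
    "harness|hendrycksTest-virology|5": {"acc": 0.5783132530120482},
}

# Average accuracy over the MMLU ("hendrycksTest") subtasks present.
mmlu = [m["acc"] for task, m in results.items()
        if task.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(f"MMLU subset average: {mmlu_avg:.4f}")
```

The same pattern extends to the full dictionary once it is read from the linked JSON file.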
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_SUSTech__SUS-Chat-34B | [
"region:us"
] | 2023-12-04T18:30:06+00:00 | {"pretty_name": "Evaluation run of SUSTech/SUS-Chat-34B", "dataset_summary": "Dataset automatically created during the evaluation run of model [SUSTech/SUS-Chat-34B](https://huggingface.co/SUSTech/SUS-Chat-34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SUSTech__SUS-Chat-34B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T10:55:45.455909](https://huggingface.co/datasets/open-llm-leaderboard/details_SUSTech__SUS-Chat-34B/blob/main/results_2023-12-10T10-55-45.455909.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7604055831741483,\n \"acc_stderr\": 0.02829305624294987,\n \"acc_norm\": 0.7636323040929514,\n \"acc_norm_stderr\": 0.028842862208073416,\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.017193835812093897,\n \"mc2\": 0.5704122295242341,\n \"mc2_stderr\": 0.014843409183922712\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882419,\n \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902274\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6400119498107947,\n \"acc_stderr\": 0.004790155370993446,\n \"acc_norm\": 0.8390758812985462,\n \"acc_norm_stderr\": 0.003667099594023359\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.725925925925926,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.725925925925926,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.025648341251693605,\n \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.025648341251693605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02461829819586651,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02461829819586651\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 
0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7872340425531915,\n \"acc_stderr\": 0.026754391348039783,\n \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.026754391348039783\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.03600105692727771,\n \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.03600105692727771\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7116402116402116,\n \"acc_stderr\": 0.023330654054535892,\n \"acc_norm\": 0.7116402116402116,\n \"acc_norm_stderr\": 0.023330654054535892\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6031746031746031,\n \"acc_stderr\": 0.043758884927270585,\n 
\"acc_norm\": 0.6031746031746031,\n \"acc_norm_stderr\": 0.043758884927270585\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8935483870967742,\n \"acc_stderr\": 0.01754510295165663,\n \"acc_norm\": 0.8935483870967742,\n \"acc_norm_stderr\": 0.01754510295165663\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706467,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706467\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.019880165406588803,\n \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.019880165406588803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.030213340289237927,\n \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.030213340289237927\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": 
{\n \"acc\": 0.8739495798319328,\n \"acc_stderr\": 0.021559623121213928,\n \"acc_norm\": 0.8739495798319328,\n \"acc_norm_stderr\": 0.021559623121213928\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571729,\n \"acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571729\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.032365852526021574,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.032365852526021574\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647332,\n \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647332\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665168,\n \"acc_norm\": 0.8888888888888888,\n 
\"acc_norm_stderr\": 0.03038159675665168\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8895705521472392,\n \"acc_stderr\": 0.024624937788941318,\n \"acc_norm\": 0.8895705521472392,\n \"acc_norm_stderr\": 0.024624937788941318\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.029315962918813474,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.029315962918813474\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813235,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813235\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9054916985951469,\n \"acc_stderr\": 0.010461015338193068,\n \"acc_norm\": 0.9054916985951469,\n \"acc_norm_stderr\": 0.010461015338193068\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6826815642458101,\n \"acc_stderr\": 0.01556639263005703,\n \"acc_norm\": 0.6826815642458101,\n \"acc_norm_stderr\": 0.01556639263005703\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02082375883758091,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02082375883758091\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8167202572347267,\n \"acc_stderr\": 0.021974198848265823,\n \"acc_norm\": 0.8167202572347267,\n 
\"acc_norm_stderr\": 0.021974198848265823\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.01924252622654454,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.01924252622654454\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6418439716312057,\n \"acc_stderr\": 0.02860208586275942,\n \"acc_norm\": 0.6418439716312057,\n \"acc_norm_stderr\": 0.02860208586275942\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6160365058670143,\n \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.6160365058670143,\n \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559352,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559352\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273337,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273337\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.023897144768914524,\n \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.023897144768914524\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n 
\"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789256,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789256\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.017193835812093897,\n \"mc2\": 0.5704122295242341,\n \"mc2_stderr\": 0.014843409183922712\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237424\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7217589082638363,\n \"acc_stderr\": 0.012343803671422683\n }\n}\n```", "repo_url": "https://huggingface.co/SUSTech/SUS-Chat-34B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|arc:challenge|25_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|gsm8k|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hellaswag|10_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-27-20.173218.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-27-20.173218.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-27-20.173218.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-55-45.455909.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-55-45.455909.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-55-45.455909.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-55-45.455909.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-55-45.455909.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-55-45.455909.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-10T10-55-45.455909.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": 
[{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", 
"path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-55-45.455909.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": 
"2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", 
"path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["**/details_harness|winogrande|5_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["**/details_harness|winogrande|5_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-10T10-55-45.455909.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_27_20.173218", "path": ["results_2023-12-04T18-27-20.173218.parquet"]}, {"split": "2023_12_10T10_55_45.455909", "path": ["results_2023-12-10T10-55-45.455909.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T10-55-45.455909.parquet"]}]}]} | 2023-12-10T10:58:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of SUSTech/SUS-Chat-34B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model SUSTech/SUS-Chat-34B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
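A minimal sketch, assuming the repository follows the standard leaderboard naming pattern `details_<org>__<model>` (the config and split names are the ones listed in this card's metadata):

```python
def details_repo(org: str, model: str) -> str:
    # Leaderboard detail repositories follow the pattern
    # open-llm-leaderboard/details_<org>__<model> (assumed from sibling cards).
    return f"open-llm-leaderboard/details_{org}__{model}"

repo_id = details_repo("SUSTech", "SUS-Chat-34B")
print(repo_id)  # open-llm-leaderboard/details_SUSTech__SUS-Chat-34B

# With the `datasets` library installed, a run's details can then be loaded:
# from datasets import load_dataset
# data = load_dataset(repo_id, "harness_winogrande_5", split="train")
```

Any other config listed in this card (e.g. `harness_hendrycksTest_virology_5`) and the `"latest"` split can be substituted in the same way.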
## Latest results
These are the latest results from run 2023-12-10T10:55:45.455909 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of SUSTech/SUS-Chat-34B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model SUSTech/SUS-Chat-34B on the Open L... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of SUSTech/SUS-Chat-34B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model SUSTech/S... | [
6,
18,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of SUSTech/SUS-Chat-34B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model SUSTech/SUS-Chat-3... |
b3ef3922debb1fe1cecb28ac8026b55b9969f093 | # Dataset Card for "semeval-task-8-a-mono-v2-test-paraphrase"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | kpriyanshu256/semeval-task-8-a-mono-v2-test-paraphrase | [
"region:us"
] | 2023-12-04T18:31:39+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "model", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "paraphrase", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 17577049, "num_examples": 5000}], "download_size": 10064093, "dataset_size": 17577049}} | 2023-12-04T18:31:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "semeval-task-8-a-mono-v2-test-paraphrase"
More Information needed | [
"# Dataset Card for \"semeval-task-8-a-mono-v2-test-paraphrase\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"semeval-task-8-a-mono-v2-test-paraphrase\"\n\nMore Information needed"
] | [
6,
29
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"semeval-task-8-a-mono-v2-test-paraphrase\"\n\nMore Information needed"
] |
3abfa80d1f1627bc4e69ffa4e76b0c0e41561ed2 |
# Dataset Card for Evaluation run of openaccess-ai-collective/dpopenhermes-alpha-v0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openaccess-ai-collective/dpopenhermes-alpha-v0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openaccess-ai-collective/dpopenhermes-alpha-v0](https://huggingface.co/openaccess-ai-collective/dpopenhermes-alpha-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__dpopenhermes-alpha-v0",
"harness_winogrande_5",
split="train")
```
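Once loaded, the per-task scores in the "Latest results" JSON below can also be post-processed directly. A minimal sketch: the three values are copied from that JSON, and preferring `acc_norm` over `acc` is an illustrative assumption, not a rule stated by the harness:

```python
# Aggregate a handful of task scores from the "Latest results" JSON below.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6501706484641638},
    "harness|hellaswag|10": {"acc_norm": 0.839573790081657},
    "harness|winogrande|5": {"acc": 0.7884767166535123},
}

def metric(task_scores: dict) -> float:
    # Prefer the normalized accuracy when present, fall back to plain accuracy.
    return task_scores.get("acc_norm", task_scores.get("acc", 0.0))

avg = sum(metric(v) for v in results.values()) / len(results)
print(round(avg, 4))  # 0.7594
```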
## Latest results
These are the [latest results from run 2023-12-04T18:36:41.738939](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__dpopenhermes-alpha-v0/blob/main/results_2023-12-04T18-36-41.738939.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6371837798967929,
"acc_stderr": 0.032261818616969314,
"acc_norm": 0.6403189407162136,
"acc_norm_stderr": 0.03290301230310095,
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.5174503213318207,
"mc2_stderr": 0.014661651601621145
},
"harness|arc:challenge|25": {
"acc": 0.5972696245733788,
"acc_stderr": 0.014332236306790154,
"acc_norm": 0.6501706484641638,
"acc_norm_stderr": 0.013936809212158301
},
"harness|hellaswag|10": {
"acc": 0.6346345349531965,
"acc_stderr": 0.004805483767055348,
"acc_norm": 0.839573790081657,
"acc_norm_stderr": 0.003662508272330896
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880274,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642507,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642507
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478923,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478923
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106596,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106596
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.02796267760476892,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.02796267760476892
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.5174503213318207,
"mc2_stderr": 0.014661651601621145
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223188
},
"harness|gsm8k|5": {
"acc": 0.558756633813495,
"acc_stderr": 0.01367705947859264
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_openaccess-ai-collective__dpopenhermes-alpha-v0 | [
"region:us"
] | 2023-12-04T18:39:31+00:00 | {"pretty_name": "Evaluation run of openaccess-ai-collective/dpopenhermes-alpha-v0", "dataset_summary": "Dataset automatically created during the evaluation run of model [openaccess-ai-collective/dpopenhermes-alpha-v0](https://huggingface.co/openaccess-ai-collective/dpopenhermes-alpha-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__dpopenhermes-alpha-v0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T18:36:41.738939](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__dpopenhermes-alpha-v0/blob/main/results_2023-12-04T18-36-41.738939.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6371837798967929,\n \"acc_stderr\": 0.032261818616969314,\n \"acc_norm\": 0.6403189407162136,\n \"acc_norm_stderr\": 0.03290301230310095,\n \"mc1\": 0.3353733170134639,\n \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.5174503213318207,\n \"mc2_stderr\": 0.014661651601621145\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.014332236306790154,\n \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.013936809212158301\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6346345349531965,\n \"acc_stderr\": 0.004805483767055348,\n \"acc_norm\": 0.839573790081657,\n \"acc_norm_stderr\": 0.003662508272330896\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880274,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n 
\"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642507,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642507\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478923,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478923\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 
0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 
0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n \"acc_stderr\": 0.016204672385106596,\n \"acc_norm\": 0.376536312849162,\n \"acc_norm_stderr\": 0.016204672385106596\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 
0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.02796267760476892,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.02796267760476892\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n 
\"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.5174503213318207,\n \"mc2_stderr\": 0.014661651601621145\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223188\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.558756633813495,\n \"acc_stderr\": 0.01367705947859264\n }\n}\n```", "repo_url": "https://huggingface.co/openaccess-ai-collective/dpopenhermes-alpha-v0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-36-41.738939.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-36-41.738939.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-36-41.738939.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-36-41.738939.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-36-41.738939.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-36-41.738939.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-36-41.738939.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-36-41.738939.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["**/details_harness|winogrande|5_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T18-36-41.738939.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_36_41.738939", "path": ["results_2023-12-04T18-36-41.738939.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T18-36-41.738939.parquet"]}]}]} | 2023-12-04T18:40:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of openaccess-ai-collective/dpopenhermes-alpha-v0
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model openaccess-ai-collective/dpopenhermes-alpha-v0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
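A minimal sketch, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other cards in this collection:

```python
from datasets import load_dataset

# Repository name assumed from the card's naming convention; adjust it if the
# actual details repo for this model differs.
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__dpopenhermes-alpha-v0",
	"harness_winogrande_5",
	split="train")
```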
## Latest results
These are the latest results from run 2023-12-04T18:36:41.738939 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of openaccess-ai-collective/dpopenhermes-alpha-v0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openacce... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openaccess-ai-collective/dpopenhermes-alpha-v0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluat... | [
6,
28,
31,
177,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/dpopenhermes-alpha-v0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run o... |
78f86ece55611c1e1b45941ff9015512d1a165c4 |
## Dataset Description
Dataset with ENEM (Exame Nacional do Ensino Médio) questions from several years and knowledge areas.
Dataset used for activities of the Natural Language Processing course in 2023/2 at INF/UFG.
## Dataset Structure
- content (question context)
- prompt (question)
- A (option A)
- B (option B)
- C (option C)
- D (option D)
- E (option E)
- answer (correct option)
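As a hypothetical illustration of how these fields fit together (the record below is invented for demonstration, not a real row from the dataset), one row can be rendered as a multiple-choice prompt:

```python
def format_question(example: dict) -> str:
    """Render one record of the ENEM schema as a multiple-choice prompt string."""
    lines = [example["content"], "", example["prompt"], ""]
    for letter in ("A", "B", "C", "D", "E"):
        lines.append(f"{letter}) {example[letter]}")
    return "\n".join(lines)

# Invented record for illustration only (not a real dataset row).
sample = {
    "content": "Texto de apoio da questão.",
    "prompt": "Qual alternativa está correta?",
    "A": "alternativa A",
    "B": "alternativa B",
    "C": "alternativa C",
    "D": "alternativa D",
    "E": "alternativa E",
    "answer": "B",
}

print(format_question(sample))
```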
### Splits
- Train (1382)
- Validation (276)
- Test (185) | douglasrolins/enem-sample | [
"region:us"
] | 2023-12-04T19:01:14+00:00 | {"dataset_info": {"features": [{"name": "content", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "E", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1489122, "num_examples": 1382}, {"name": "validation", "num_bytes": 296526, "num_examples": 276}, {"name": "test", "num_bytes": 197526, "num_examples": 185}], "download_size": 1370332, "dataset_size": 1983174}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2023-12-04T19:06:58+00:00 | [] | [] | TAGS
#region-us
|
## Dataset Description
Dataset with ENEM (Exame Nacional do Ensino Médio) questions from several years and knowledge areas.
Dataset used for activities of the Natural Language Processing course in 2023/2 at INF/UFG.
## Dataset Structure
- content (question context)
- prompt (question)
- A (option A)
- B (option B)
- C (option C)
- D (option D)
- E (option E)
- answer (correct option)
### Splits
- Train (1382)
- Validation (276)
- Test (185) | [
"## Descrição do Dataset Description\n\nDataset com questões do ENEM (Exame Nacional do Ensino Médio) de diversos anos e áreas de conhecimento.\n\nDataset utilizado para atividades da disciplina de Processamento de Linguagem Natural em 2023/2 no INF/UFG.",
"## Estrutura do Dataset\n\n- content (contexto da questã... | [
"TAGS\n#region-us \n",
"## Descrição do Dataset Description\n\nDataset com questões do ENEM (Exame Nacional do Ensino Médio) de diversos anos e áreas de conhecimento.\n\nDataset utilizado para atividades da disciplina de Processamento de Linguagem Natural em 2023/2 no INF/UFG.",
"## Estrutura do Dataset\n\n- co... | [
6,
54,
65,
18
] | [
"passage: TAGS\n#region-us \n## Descrição do Dataset Description\n\nDataset com questões do ENEM (Exame Nacional do Ensino Médio) de diversos anos e áreas de conhecimento.\n\nDataset utilizado para atividades da disciplina de Processamento de Linguagem Natural em 2023/2 no INF/UFG.## Estrutura do Dataset\n\n- conte... |
336d63bb470c28dd016d02776453366a347770b4 |
# Dataset Card for Evaluation run of mergedlm/zephyrnotus-11b-alpha
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mergedlm/zephyrnotus-11b-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mergedlm/zephyrnotus-11b-alpha](https://huggingface.co/mergedlm/zephyrnotus-11b-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T18:58:32.292259](https://huggingface.co/datasets/open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha/blob/main/results_2023-12-04T18-58-32.292259.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6022156893041962,
"acc_stderr": 0.03336144237047965,
"acc_norm": 0.6105212360689912,
"acc_norm_stderr": 0.03409373060010593,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5721680885718956,
"mc2_stderr": 0.015636158796667236
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.613481228668942,
"acc_norm_stderr": 0.014230084761910478
},
"harness|hellaswag|10": {
"acc": 0.6352320254929297,
"acc_stderr": 0.004803812631994954,
"acc_norm": 0.8280223063134834,
"acc_norm_stderr": 0.003765898364938865
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798335,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798335
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835772,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835772
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.04161808503501531,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.04161808503501531
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851116,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851116
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422876,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422876
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101077,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101077
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688218,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.015566392630057031,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.015566392630057031
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.424380704041721,
"acc_stderr": 0.01262334375743002,
"acc_norm": 0.424380704041721,
"acc_norm_stderr": 0.01262334375743002
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281508,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440303,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440303
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768924,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768924
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5721680885718956,
"mc2_stderr": 0.015636158796667236
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275625
},
"harness|gsm8k|5": {
"acc": 0.17134192570128887,
"acc_stderr": 0.010379150273178357
}
}
```
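As an informal sketch of how the per-task numbers above can be post-processed, the snippet below averages the 5-shot MMLU ("hendrycksTest") accuracies; the dict is a hand-copied, truncated subset of the results block, not the full run:

```python
# Average the 5-shot MMLU ("hendrycksTest") accuracies from a results dict
# shaped like the block above; only a few tasks are copied here for brevity.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.562962962962963},
    "harness|arc:challenge|25": {"acc": 0.5853242320819113},  # not MMLU, filtered out
}

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))  # mean over the two MMLU tasks included here
```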
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha | [
"region:us"
] | 2023-12-04T19:01:27+00:00 | {"pretty_name": "Evaluation run of mergedlm/zephyrnotus-11b-alpha", "dataset_summary": "Dataset automatically created during the evaluation run of model [mergedlm/zephyrnotus-11b-alpha](https://huggingface.co/mergedlm/zephyrnotus-11b-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T18:58:32.292259](https://huggingface.co/datasets/open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha/blob/main/results_2023-12-04T18-58-32.292259.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6022156893041962,\n \"acc_stderr\": 0.03336144237047965,\n \"acc_norm\": 0.6105212360689912,\n \"acc_norm_stderr\": 0.03409373060010593,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5721680885718956,\n \"mc2_stderr\": 0.015636158796667236\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n \"acc_norm\": 0.613481228668942,\n \"acc_norm_stderr\": 0.014230084761910478\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6352320254929297,\n \"acc_stderr\": 0.004803812631994954,\n \"acc_norm\": 0.8280223063134834,\n \"acc_norm_stderr\": 0.003765898364938865\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798335,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798335\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835772,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835772\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.04161808503501531,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.04161808503501531\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851116,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851116\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 
0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.03567969772268049,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.03567969772268049\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422876,\n \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422876\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 
0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n \"acc_stderr\": 0.015075523238101077,\n \"acc_norm\": 0.768837803320562,\n \"acc_norm_stderr\": 0.015075523238101077\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688218,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688218\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n \"acc_stderr\": 0.015566392630057031,\n \"acc_norm\": 0.31731843575418994,\n \"acc_norm_stderr\": 0.015566392630057031\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615693,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615693\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 
0.6720257234726688,\n \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n \"acc_stderr\": 0.01262334375743002,\n \"acc_norm\": 0.424380704041721,\n \"acc_norm_stderr\": 0.01262334375743002\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281508,\n \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281508\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440303,\n \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440303\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768924,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768924\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n 
\"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5721680885718956,\n \"mc2_stderr\": 0.015636158796667236\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275625\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17134192570128887,\n \"acc_stderr\": 0.010379150273178357\n }\n}\n```", "repo_url": "https://huggingface.co/mergedlm/zephyrnotus-11b-alpha", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-58-32.292259.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-58-32.292259.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-58-32.292259.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-58-32.292259.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-58-32.292259.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-58-32.292259.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-58-32.292259.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-58-32.292259.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["**/details_harness|winogrande|5_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T18-58-32.292259.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_58_32.292259", "path": ["results_2023-12-04T18-58-32.292259.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T18-58-32.292259.parquet"]}]}]} | 2023-12-04T19:02:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mergedlm/zephyrnotus-11b-alpha
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mergedlm/zephyrnotus-11b-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mergedlm/zephyrnotus-11b-alpha](https://huggingface.co/mergedlm/zephyrnotus-11b-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
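The code snippet itself is missing from this copy of the card. A sketch following the template of the sibling cards in this collection — note that the `details_…` repo id here is derived from the model name by convention, an assumption rather than something this fragment confirms:

```python
def details_repo_id(model_id: str) -> str:
    """Build the leaderboard detail repo id for a model.

    Detail repos appear to follow "open-llm-leaderboard/details_<org>__<model>"
    (naming inferred from the sibling cards in this collection).
    """
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")


def load_details(model_id: str, config: str = "harness_winogrande_5"):
    # Lazy import: requires the `datasets` package and network access.
    from datasets import load_dataset
    return load_dataset(details_repo_id(model_id), config, split="train")


# e.g. data = load_details("mergedlm/zephyrnotus-11b-alpha")
```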
## Latest results
These are the latest results from run 2023-12-04T18:58:32.292259 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
# Dataset Card for Evaluation run of Q-bert/Optimus-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Q-bert/Optimus-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Q-bert/Optimus-7B](https://huggingface.co/Q-bert/Optimus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Q-bert__Optimus-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T18:59:49.207215](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__Optimus-7B/blob/main/results_2023-12-04T18-59-49.207215.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6392766919486776,
"acc_stderr": 0.03229323039450184,
"acc_norm": 0.6401046235645558,
"acc_norm_stderr": 0.032947292086342644,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5578912654610536,
"mc2_stderr": 0.015509983004926231
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.01413117676013117,
"acc_norm": 0.6544368600682594,
"acc_norm_stderr": 0.013896938461145673
},
"harness|hellaswag|10": {
"acc": 0.6683927504481179,
"acc_stderr": 0.004698285350019212,
"acc_norm": 0.8541127265484963,
"acc_norm_stderr": 0.003522717499524299
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.02293514405391945,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.02293514405391945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.01639943636661292,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.01639943636661292
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069432,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069432
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39553072625698327,
"acc_stderr": 0.016353415410075775,
"acc_norm": 0.39553072625698327,
"acc_norm_stderr": 0.016353415410075775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342506,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5578912654610536,
"mc2_stderr": 0.015509983004926231
},
"harness|winogrande|5": {
"acc": 0.7876874506708761,
"acc_stderr": 0.01149338468724978
},
"harness|gsm8k|5": {
"acc": 0.6550416982562547,
"acc_stderr": 0.013093630133666235
}
}
```
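The per-task keys above follow the pattern `harness|<task>|<n_shots>`, so benchmark families can be sliced out of the dict programmatically. A small sketch over a two-entry excerpt of the results above, averaging the `hendrycksTest-*` (MMLU) subtask accuracies — shown here only as an illustration of the key layout, not as the leaderboard's official aggregation code:

```python
# Three entries excerpted verbatim from the results dict above.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
    "harness|gsm8k|5": {"acc": 0.6550416982562547},
}

# Collect the MMLU subtasks (keys of the form harness|hendrycksTest-<subject>|5)
# and average their accuracies.
mmlu = {key.split("|")[1]: scores["acc"]
        for key, scores in results.items()
        if key.startswith("harness|hendrycksTest-")}
mmlu_avg = sum(mmlu.values()) / len(mmlu)
print(sorted(mmlu), round(mmlu_avg, 4))
# → ['hendrycksTest-anatomy', 'hendrycksTest-astronomy'] 0.6639
```

The same pattern generalizes to the full results file, which contains all 57 MMLU subtasks under the same key scheme.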
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
"region:us"
] | 2023-12-04T19:02:47+00:00 | {"pretty_name": "Evaluation run of Q-bert/Optimus-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Q-bert/Optimus-7B](https://huggingface.co/Q-bert/Optimus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Q-bert__Optimus-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T18:59:49.207215](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__Optimus-7B/blob/main/results_2023-12-04T18-59-49.207215.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6392766919486776,\n \"acc_stderr\": 0.03229323039450184,\n \"acc_norm\": 0.6401046235645558,\n \"acc_norm_stderr\": 0.032947292086342644,\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5578912654610536,\n \"mc2_stderr\": 0.015509983004926231\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.01413117676013117,\n \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.013896938461145673\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6683927504481179,\n \"acc_stderr\": 0.004698285350019212,\n \"acc_norm\": 0.8541127265484963,\n \"acc_norm_stderr\": 0.003522717499524299\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 
0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.02293514405391945,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.02293514405391945\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 
0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.01639943636661292,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.01639943636661292\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069432,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069432\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 
0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39553072625698327,\n \"acc_stderr\": 0.016353415410075775,\n \"acc_norm\": 0.39553072625698327,\n \"acc_norm_stderr\": 0.016353415410075775\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n 
\"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342506,\n \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342506\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n 
\"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5578912654610536,\n \"mc2_stderr\": 0.015509983004926231\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7876874506708761,\n \"acc_stderr\": 0.01149338468724978\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6550416982562547,\n \"acc_stderr\": 0.013093630133666235\n }\n}\n```", "repo_url": "https://huggingface.co/Q-bert/Optimus-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-59-49.207215.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-59-49.207215.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-59-49.207215.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-59-49.207215.parquet", 
"**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-59-49.207215.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-59-49.207215.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-59-49.207215.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-59-49.207215.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": 
["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-59-49.207215.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-59-49.207215.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["**/details_harness|winogrande|5_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T18-59-49.207215.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T18_59_49.207215", "path": ["results_2023-12-04T18-59-49.207215.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T18-59-49.207215.parquet"]}]}]} | 2023-12-04T19:04:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Q-bert/Optimus-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Q-bert/Optimus-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-04T18:59:49.207215 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
5fa5b832f764eaa830c9138937d5a575350e72a6 |
# Dataset Card for Evaluation run of Q-bert/Bumblebee-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Q-bert/Bumblebee-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Q-bert/Bumblebee-7B](https://huggingface.co/Q-bert/Bumblebee-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Q-bert__Bumblebee-7B",
"harness_winogrande_5",
	split="latest")
```
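The per-task configuration names used with `load_dataset` can be derived from the harness task identifiers that appear in the results below (e.g. `harness|truthfulqa:mc|0` corresponds to the `harness_truthfulqa_mc_0` configuration). A small helper sketching that mapping — this is a convenience function inferred from the configuration list in this card's metadata, not part of any official API:

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Map a harness task id and shot count to this dataset's config name.

    Inferred convention: ":" and "-" in the task id become "_", and the
    shot count is appended, e.g. ("arc:challenge", 25) -> "harness_arc_challenge_25".
    """
    return "harness_" + task.replace(":", "_").replace("-", "_") + f"_{n_shot}"


print(harness_config_name("arc:challenge", 25))   # harness_arc_challenge_25
print(harness_config_name("truthfulqa:mc", 0))    # harness_truthfulqa_mc_0
```

The same rule covers the MMLU subtasks, e.g. `hendrycksTest-world_religions` with 5 shots maps to `harness_hendrycksTest_world_religions_5`.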
## Latest results
These are the [latest results from run 2023-12-04T19:02:58.959245](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__Bumblebee-7B/blob/main/results_2023-12-04T19-02-58.959245.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6423559079822854,
"acc_stderr": 0.03212728779652208,
"acc_norm": 0.6433195276283713,
"acc_norm_stderr": 0.032776334270389784,
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5095643475017586,
"mc2_stderr": 0.015572340523512473
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.014258563880513778,
"acc_norm": 0.6339590443686007,
"acc_norm_stderr": 0.014077223108470137
},
"harness|hellaswag|10": {
"acc": 0.6554471220872337,
"acc_stderr": 0.0047425103547779025,
"acc_norm": 0.8415654252141008,
"acc_norm_stderr": 0.0036440173837115923
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101736,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101736
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440679,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440679
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.01619780795684805,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.01619780795684805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947409,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947409
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662253,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.016155910721341767,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.016155910721341767
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214963,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5095643475017586,
"mc2_stderr": 0.015572340523512473
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.6565579984836998,
"acc_stderr": 0.013079933811800304
}
}
```
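For quick inspection outside of `datasets`, the headline numbers above can be handled as a plain dictionary. A minimal sketch with values copied from the results JSON — the mean computed here is just an illustrative average of these five metrics, not the official leaderboard score:

```python
# Headline metrics copied verbatim from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6339590443686007},
    "harness|hellaswag|10": {"acc_norm": 0.8415654252141008},
    "harness|truthfulqa:mc|0": {"mc2": 0.5095643475017586},
    "harness|winogrande|5": {"acc": 0.7821625887924231},
    "harness|gsm8k|5": {"acc": 0.6565579984836998},
}

# Each task here reports a single headline metric; pull it out by task id.
headline = {task: next(iter(metrics.values())) for task, metrics in results.items()}

# Illustrative unweighted mean of these five numbers only.
simple_mean = sum(headline.values()) / len(headline)
print(round(simple_mean, 4))  # 0.6848
```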
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Q-bert__Bumblebee-7B | [
"region:us"
] | 2023-12-04T19:05:58+00:00 | {"pretty_name": "Evaluation run of Q-bert/Bumblebee-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Q-bert/Bumblebee-7B](https://huggingface.co/Q-bert/Bumblebee-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Q-bert__Bumblebee-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T19:02:58.959245](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__Bumblebee-7B/blob/main/results_2023-12-04T19-02-58.959245.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6423559079822854,\n \"acc_stderr\": 0.03212728779652208,\n \"acc_norm\": 0.6433195276283713,\n \"acc_norm_stderr\": 0.032776334270389784,\n \"mc1\": 0.35495716034271724,\n \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5095643475017586,\n \"mc2_stderr\": 0.015572340523512473\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.014258563880513778,\n \"acc_norm\": 0.6339590443686007,\n \"acc_norm_stderr\": 0.014077223108470137\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6554471220872337,\n \"acc_stderr\": 0.0047425103547779025,\n \"acc_norm\": 0.8415654252141008,\n \"acc_norm_stderr\": 0.0036440173837115923\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101736,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101736\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440679,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440679\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 
0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 
0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684805,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684805\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947409,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947409\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n 
},\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.013547415658662253,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.013547415658662253\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n \"acc_stderr\": 0.016155910721341767,\n \"acc_norm\": 0.37094972067039106,\n \"acc_norm_stderr\": 0.016155910721341767\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 
0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n 
\"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5095643475017586,\n \"mc2_stderr\": 0.015572340523512473\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6565579984836998,\n \"acc_stderr\": 0.013079933811800304\n }\n}\n```", "repo_url": "https://huggingface.co/Q-bert/Bumblebee-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-02-58.959245.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-02-58.959245.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-02-58.959245.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-02-58.959245.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-02-58.959245.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-02-58.959245.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-02-58.959245.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-02-58.959245.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["**/details_harness|winogrande|5_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T19-02-58.959245.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T19_02_58.959245", "path": ["results_2023-12-04T19-02-58.959245.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T19-02-58.959245.parquet"]}]}]} | 2023-12-04T19:06:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Q-bert/Bumblebee-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Q-bert/Bumblebee-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
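A minimal sketch of what that load looks like with the `datasets` library. The repo id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming convention (the "/" in the model id replaced by "__"), and the `load_dataset` call itself is shown commented out since it requires network access:

```python
# Derive the assumed detail-dataset repo id from the model id.
# Convention (assumption): "open-llm-leaderboard/details_<org>__<model>".
model_id = "Q-bert/Bumblebee-7B"
repo_id = "open-llm-leaderboard/details_" + model_id.replace("/", "__")
print(repo_id)  # open-llm-leaderboard/details_Q-bert__Bumblebee-7B

# With `datasets` installed (pip install datasets), one of the 63 configs
# can then be fetched, e.g. the 5-shot Winogrande details:
# from datasets import load_dataset
# data = load_dataset(repo_id, "harness_winogrande_5", split="latest")
```

Any of the config names listed in this card's metadata (e.g. `harness_gsm8k_5`, `harness_truthfulqa_mc_0`, or `results`) can be passed as the second argument.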
## Latest results
These are the latest results from run 2023-12-04T19:02:58.959245 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Q-bert/Bumblebee-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Q-bert/Bumblebee-7B on the Open LLM... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Q-bert/Bumblebee-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Q-bert/Bum... | [
6,
19,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Q-bert/Bumblebee-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Q-bert/Bumblebee-7B... |
7b50036b20f141b65097e3c4afaa38f1d7fa186b |
# Dataset Card for Evaluation run of rufjdk5480/llama-7b-ludwig-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/rufjdk5480/llama-7b-ludwig-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [rufjdk5480/llama-7b-ludwig-alpaca](https://huggingface.co/rufjdk5480/llama-7b-ludwig-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rufjdk5480__llama-7b-ludwig-alpaca",
"harness_winogrande_5",
split="train")
```
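The configuration and split names in this repository appear to be derived mechanically from the harness task names and run timestamps (e.g. `harness|truthfulqa:mc|0` becomes config `harness_truthfulqa_mc_0`, and the run timestamp `2023-12-04T19:02:43.278164` becomes split `2023_12_04T19_02_43.278164`). A minimal sketch of that mapping, inferred from the names listed in this card rather than taken from the leaderboard code:

```python
def task_to_config(task: str) -> str:
    """Map a harness task name to its dataset configuration name,
    e.g. "harness|winogrande|5" -> "harness_winogrande_5"."""
    return task.replace("|", "_").replace(":", "_")


def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name,
    e.g. "2023-12-04T19:02:43.278164" -> "2023_12_04T19_02_43.278164"."""
    return timestamp.replace("-", "_").replace(":", "_")


print(task_to_config("harness|truthfulqa:mc|0"))         # harness_truthfulqa_mc_0
print(timestamp_to_split("2023-12-04T19:02:43.278164"))  # 2023_12_04T19_02_43.278164
```

Note that the MMLU subtasks are an exception: they are grouped under the single `harness_hendrycksTest_5` configuration rather than one configuration per subtask.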
## Latest results
These are the [latest results from run 2023-12-04T19:02:43.278164](https://huggingface.co/datasets/open-llm-leaderboard/details_rufjdk5480__llama-7b-ludwig-alpaca/blob/main/results_2023-12-04T19-02-43.278164.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46060379742069524,
"acc_stderr": 0.03441125676761374,
"acc_norm": 0.46498436629120626,
"acc_norm_stderr": 0.03518871840894695,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502353,
"mc2": 0.4191386468811098,
"mc2_stderr": 0.014271816029328676
},
"harness|arc:challenge|25": {
"acc": 0.5085324232081911,
"acc_stderr": 0.014609263165632182,
"acc_norm": 0.5401023890784983,
"acc_norm_stderr": 0.01456431885692485
},
"harness|hellaswag|10": {
"acc": 0.5903206532563234,
"acc_stderr": 0.004907694727935688,
"acc_norm": 0.7872933678550089,
"acc_norm_stderr": 0.004083855139469325
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296558,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296558
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.03056159042673184,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.03056159042673184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.02241804289111394,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.02241804289111394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.039701582732351734,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.039701582732351734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47419354838709676,
"acc_stderr": 0.028406095057653315,
"acc_norm": 0.47419354838709676,
"acc_norm_stderr": 0.028406095057653315
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970187,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970187
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5252525252525253,
"acc_stderr": 0.03557806245087314,
"acc_norm": 0.5252525252525253,
"acc_norm_stderr": 0.03557806245087314
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.03324837939758159,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.03324837939758159
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4461538461538462,
"acc_stderr": 0.025203571773028333,
"acc_norm": 0.4461538461538462,
"acc_norm_stderr": 0.025203571773028333
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6311926605504588,
"acc_stderr": 0.020686227560729565,
"acc_norm": 0.6311926605504588,
"acc_norm_stderr": 0.020686227560729565
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03114144782353603,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03114144782353603
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.03476099060501636,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.03476099060501636
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6371308016877637,
"acc_stderr": 0.03129920825530213,
"acc_norm": 0.6371308016877637,
"acc_norm_stderr": 0.03129920825530213
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5291479820627802,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.5291479820627802,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.029872577708891197,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.029872577708891197
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6130268199233716,
"acc_stderr": 0.017417138059440132,
"acc_norm": 0.6130268199233716,
"acc_norm_stderr": 0.017417138059440132
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.026911898686377927,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.026911898686377927
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.02784647600593047,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.02784647600593047
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4845679012345679,
"acc_stderr": 0.0278074900442762,
"acc_norm": 0.4845679012345679,
"acc_norm_stderr": 0.0278074900442762
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611327,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611327
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36571056062581486,
"acc_stderr": 0.012301028188840567,
"acc_norm": 0.36571056062581486,
"acc_norm_stderr": 0.012301028188840567
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43300653594771243,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.43300653594771243,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.47346938775510206,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.47346938775510206,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5920398009950248,
"acc_stderr": 0.03475116365194092,
"acc_norm": 0.5920398009950248,
"acc_norm_stderr": 0.03475116365194092
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502353,
"mc2": 0.4191386468811098,
"mc2_stderr": 0.014271816029328676
},
"harness|winogrande|5": {
"acc": 0.7426992896606156,
"acc_stderr": 0.01228598961886571
},
"harness|gsm8k|5": {
"acc": 0.14859742228961334,
"acc_stderr": 0.00979750318052788
}
}
```
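Once loaded, the per-task dictionaries above can be post-processed with plain Python. As an illustrative sketch (using a hand-copied subset of the values shown above, not the evaluation harness itself), this averages the `acc` field over the MMLU (`hendrycksTest`) subtasks:

```python
# Hand-copied subset of the results dict above, for illustration only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4666666666666667},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.4276315789473684},
    "harness|truthfulqa:mc|0": {"mc1": 0.2864137086903305},
}

# Keep only the MMLU subtasks and average their accuracies.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU subtasks: {len(mmlu_accs)}, mean acc: {mmlu_avg:.4f}")
# prints: MMLU subtasks: 3, mean acc: 0.4048
```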
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_rufjdk5480__llama-7b-ludwig-alpaca | [
"region:us"
] | 2023-12-04T19:05:58+00:00 | {"pretty_name": "Evaluation run of rufjdk5480/llama-7b-ludwig-alpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [rufjdk5480/llama-7b-ludwig-alpaca](https://huggingface.co/rufjdk5480/llama-7b-ludwig-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rufjdk5480__llama-7b-ludwig-alpaca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T19:02:43.278164](https://huggingface.co/datasets/open-llm-leaderboard/details_rufjdk5480__llama-7b-ludwig-alpaca/blob/main/results_2023-12-04T19-02-43.278164.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46060379742069524,\n \"acc_stderr\": 0.03441125676761374,\n \"acc_norm\": 0.46498436629120626,\n \"acc_norm_stderr\": 0.03518871840894695,\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502353,\n \"mc2\": 0.4191386468811098,\n \"mc2_stderr\": 0.014271816029328676\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5085324232081911,\n \"acc_stderr\": 0.014609263165632182,\n \"acc_norm\": 0.5401023890784983,\n \"acc_norm_stderr\": 0.01456431885692485\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5903206532563234,\n \"acc_stderr\": 0.004907694727935688,\n \"acc_norm\": 0.7872933678550089,\n \"acc_norm_stderr\": 0.004083855139469325\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296558,\n \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296558\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.03056159042673184,\n \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.03056159042673184\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 
0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.02241804289111394,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.02241804289111394\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.039701582732351734,\n 
\"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.039701582732351734\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.47419354838709676,\n \"acc_stderr\": 0.028406095057653315,\n \"acc_norm\": 0.47419354838709676,\n \"acc_norm_stderr\": 0.028406095057653315\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970187,\n \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970187\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.038154943086889305,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.038154943086889305\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5252525252525253,\n \"acc_stderr\": 0.03557806245087314,\n \"acc_norm\": 0.5252525252525253,\n \"acc_norm_stderr\": 0.03557806245087314\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.03324837939758159,\n \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.03324837939758159\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4461538461538462,\n \"acc_stderr\": 0.025203571773028333,\n \"acc_norm\": 0.4461538461538462,\n \"acc_norm_stderr\": 0.025203571773028333\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6311926605504588,\n \"acc_stderr\": 0.020686227560729565,\n \"acc_norm\": 0.6311926605504588,\n \"acc_norm_stderr\": 0.020686227560729565\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353603,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353603\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.03476099060501636,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.03476099060501636\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6371308016877637,\n \"acc_stderr\": 0.03129920825530213,\n \"acc_norm\": 0.6371308016877637,\n \"acc_norm_stderr\": 0.03129920825530213\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5291479820627802,\n \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.5291479820627802,\n \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.04832853553437055,\n 
\"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.029872577708891197,\n \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.029872577708891197\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6130268199233716,\n \"acc_stderr\": 0.017417138059440132,\n \"acc_norm\": 0.6130268199233716,\n \"acc_norm_stderr\": 0.017417138059440132\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.026911898686377927,\n \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.026911898686377927\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n \"acc_stderr\": 0.02784647600593047,\n 
\"acc_norm\": 0.5980707395498392,\n \"acc_norm_stderr\": 0.02784647600593047\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4845679012345679,\n \"acc_stderr\": 0.0278074900442762,\n \"acc_norm\": 0.4845679012345679,\n \"acc_norm_stderr\": 0.0278074900442762\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611327,\n \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611327\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36571056062581486,\n \"acc_stderr\": 0.012301028188840567,\n \"acc_norm\": 0.36571056062581486,\n \"acc_norm_stderr\": 0.012301028188840567\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.43300653594771243,\n \"acc_stderr\": 0.020045442473324227,\n \"acc_norm\": 0.43300653594771243,\n \"acc_norm_stderr\": 0.020045442473324227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.47346938775510206,\n \"acc_stderr\": 0.03196412734523272,\n \"acc_norm\": 0.47346938775510206,\n \"acc_norm_stderr\": 0.03196412734523272\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5920398009950248,\n \"acc_stderr\": 0.03475116365194092,\n \"acc_norm\": 0.5920398009950248,\n \"acc_norm_stderr\": 0.03475116365194092\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n 
\"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488905,\n \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488905\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502353,\n \"mc2\": 0.4191386468811098,\n \"mc2_stderr\": 0.014271816029328676\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.01228598961886571\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14859742228961334,\n \"acc_stderr\": 0.00979750318052788\n }\n}\n```", "repo_url": "https://huggingface.co/rufjdk5480/llama-7b-ludwig-alpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-02-43.278164.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-02-43.278164.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-02-43.278164.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-02-43.278164.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-02-43.278164.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-02-43.278164.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-02-43.278164.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": 
"2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": 
"2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-02-43.278164.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["**/details_harness|winogrande|5_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T19-02-43.278164.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T19_02_43.278164", "path": ["results_2023-12-04T19-02-43.278164.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T19-02-43.278164.parquet"]}]}]} | 2023-12-04T19:06:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of rufjdk5480/llama-7b-ludwig-alpaca
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model rufjdk5480/llama-7b-ludwig-alpaca on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
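The code snippet was dropped from this stripped copy of the card; a minimal sketch follows, assuming the repo-naming convention `open-llm-leaderboard/details_<org>__<model>` seen on sibling cards in this collection (the config name `harness_winogrande_5` is used purely for illustration):

```python
# Sketch: build the Hub repo id used by the leaderboard "details" datasets.
# The naming convention below is inferred from sibling cards in this dump,
# not stated in this card itself.
def details_repo(org: str, model: str) -> str:
    return f"open-llm-leaderboard/details_{org}__{model}"

repo = details_repo("rufjdk5480", "llama-7b-ludwig-alpaca")
print(repo)

# With the `datasets` library installed (and network access), the details
# for one task can then be loaded as shown on the other cards:
#   from datasets import load_dataset
#   data = load_dataset(repo, "harness_winogrande_5", split="train")
```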
## Latest results
These are the latest results from run 2023-12-04T19:02:43.278164 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of rufjdk5480/llama-7b-ludwig-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model rufjdk5480/llama-7b-l... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of rufjdk5480/llama-7b-ludwig-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mo... | [
6,
26,
31,
175,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of rufjdk5480/llama-7b-ludwig-alpaca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model rufjd... |
87622b7ed76f2cd4ed258db52b1f643c39643ed4 |
# Dataset Card for Evaluation run of KnutJaegersberg/falcon-1b-t-sft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/falcon-1b-t-sft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/falcon-1b-t-sft](https://huggingface.co/KnutJaegersberg/falcon-1b-t-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
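As a small sketch, the timestamped split names can be derived from a run timestamp; the replacement rule below is inferred from this repo's own config metadata (it is not part of the official `datasets` API):

```python
# Each run's split name is the run timestamp with "-" and ":" replaced by "_"
# (inferred from this repo's config metadata; shown here as a sketch).
timestamp = "2023-12-04T19:05:57.412781"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_12_04T19_05_57.412781
```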
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__falcon-1b-t-sft",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T19:05:57.412781](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__falcon-1b-t-sft/blob/main/results_2023-12-04T19-05-57.412781.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2571482773686455,
"acc_stderr": 0.030827593737624604,
"acc_norm": 0.2593690477702816,
"acc_norm_stderr": 0.03161954080445179,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.38486056709707445,
"mc2_stderr": 0.015385392751923936
},
"harness|arc:challenge|25": {
"acc": 0.2841296928327645,
"acc_stderr": 0.013179442447653887,
"acc_norm": 0.3293515358361775,
"acc_norm_stderr": 0.013734057652635474
},
"harness|hellaswag|10": {
"acc": 0.43905596494722166,
"acc_stderr": 0.004952576863315219,
"acc_norm": 0.5723959370643298,
"acc_norm_stderr": 0.004937199759947685
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2037735849056604,
"acc_stderr": 0.0247907845017754,
"acc_norm": 0.2037735849056604,
"acc_norm_stderr": 0.0247907845017754
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.031862098516411426,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.031862098516411426
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2,
"acc_stderr": 0.026148818018424495,
"acc_norm": 0.2,
"acc_norm_stderr": 0.026148818018424495
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261117,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261117
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276861,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276861
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.030712730070982592,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.030712730070982592
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817258,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817258
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246797,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246797
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882374,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882374
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22201834862385322,
"acc_stderr": 0.01781884956479661,
"acc_norm": 0.22201834862385322,
"acc_norm_stderr": 0.01781884956479661
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.027232298462690225,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.027232298462690225
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083292,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083292
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2645739910313901,
"acc_stderr": 0.029605103217038325,
"acc_norm": 0.2645739910313901,
"acc_norm_stderr": 0.029605103217038325
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.1553398058252427,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.1553398058252427,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.02891120880274947,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.02891120880274947
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.280970625798212,
"acc_stderr": 0.01607312785122125,
"acc_norm": 0.280970625798212,
"acc_norm_stderr": 0.01607312785122125
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3092485549132948,
"acc_stderr": 0.024883140570071755,
"acc_norm": 0.3092485549132948,
"acc_norm_stderr": 0.024883140570071755
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26792698826597133,
"acc_stderr": 0.01131134769063389,
"acc_norm": 0.26792698826597133,
"acc_norm_stderr": 0.01131134769063389
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16176470588235295,
"acc_stderr": 0.022368672562886754,
"acc_norm": 0.16176470588235295,
"acc_norm_stderr": 0.022368672562886754
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072773,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072773
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.026358916334904038,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.026358916334904038
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.38486056709707445,
"mc2_stderr": 0.015385392751923936
},
"harness|winogrande|5": {
"acc": 0.5588003157063931,
"acc_stderr": 0.013954975072834738
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245449
}
}
```
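As a quick sketch, the headline metrics above can be pulled out of the JSON with plain Python. The values below are copied from the results block; the unweighted mean is illustrative only, not the leaderboard's exact aggregation formula:

```python
# A few of the per-task scores copied from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.3293515358361775},
    "harness|hellaswag|10": {"acc_norm": 0.5723959370643298},
    "harness|winogrande|5": {"acc": 0.5588003157063931},
    "harness|gsm8k|5": {"acc": 0.003032600454890068},
}

# Simple unweighted mean over one chosen metric per task
# (illustrative only; the leaderboard applies its own aggregation).
scores = [
    results["harness|arc:challenge|25"]["acc_norm"],
    results["harness|hellaswag|10"]["acc_norm"],
    results["harness|winogrande|5"]["acc"],
    results["harness|gsm8k|5"]["acc"],
]
mean_score = sum(scores) / len(scores)
print(round(mean_score, 4))  # 0.3659
```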
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__falcon-1b-t-sft | [
"region:us"
] | 2023-12-04T19:08:04+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/falcon-1b-t-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/falcon-1b-t-sft](https://huggingface.co/KnutJaegersberg/falcon-1b-t-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__falcon-1b-t-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T19:05:57.412781](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__falcon-1b-t-sft/blob/main/results_2023-12-04T19-05-57.412781.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2571482773686455,\n \"acc_stderr\": 0.030827593737624604,\n \"acc_norm\": 0.2593690477702816,\n \"acc_norm_stderr\": 0.03161954080445179,\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.38486056709707445,\n \"mc2_stderr\": 0.015385392751923936\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2841296928327645,\n \"acc_stderr\": 0.013179442447653887,\n \"acc_norm\": 0.3293515358361775,\n \"acc_norm_stderr\": 0.013734057652635474\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.43905596494722166,\n \"acc_stderr\": 0.004952576863315219,\n \"acc_norm\": 0.5723959370643298,\n \"acc_norm_stderr\": 0.004937199759947685\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2037735849056604,\n \"acc_stderr\": 0.0247907845017754,\n \"acc_norm\": 0.2037735849056604,\n \"acc_norm_stderr\": 0.0247907845017754\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 
0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.031862098516411426,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.031862098516411426\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.026148818018424495,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.026148818018424495\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261117,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261117\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276861,\n \"acc_norm\": 0.1984126984126984,\n 
\"acc_norm_stderr\": 0.03567016675276861\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23225806451612904,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.23225806451612904,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.030712730070982592,\n \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.030712730070982592\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817258,\n \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817258\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246797,\n \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246797\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 
0.22268907563025211,\n \"acc_stderr\": 0.027025433498882374,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882374\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22201834862385322,\n \"acc_stderr\": 0.01781884956479661,\n \"acc_norm\": 0.22201834862385322,\n \"acc_norm_stderr\": 0.01781884956479661\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.19907407407407407,\n \"acc_stderr\": 0.027232298462690225,\n \"acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.027232298462690225\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083292,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083292\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460302,\n \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460302\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2645739910313901,\n \"acc_stderr\": 0.029605103217038325,\n \"acc_norm\": 0.2645739910313901,\n \"acc_norm_stderr\": 0.029605103217038325\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041018,\n \"acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041018\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 
0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1553398058252427,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.1553398058252427,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.02891120880274947,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.02891120880274947\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.280970625798212,\n \"acc_stderr\": 0.01607312785122125,\n \"acc_norm\": 0.280970625798212,\n \"acc_norm_stderr\": 0.01607312785122125\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3092485549132948,\n \"acc_stderr\": 0.024883140570071755,\n \"acc_norm\": 0.3092485549132948,\n \"acc_norm_stderr\": 0.024883140570071755\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n \"acc_norm_stderr\": 
0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621344,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621344\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26792698826597133,\n \"acc_stderr\": 0.01131134769063389,\n \"acc_norm\": 0.26792698826597133,\n \"acc_norm_stderr\": 0.01131134769063389\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.16176470588235295,\n \"acc_stderr\": 0.022368672562886754,\n \"acc_norm\": 0.16176470588235295,\n \"acc_norm_stderr\": 0.022368672562886754\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072773,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072773\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904038,\n \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904038\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 
0.2289156626506024,\n \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.38486056709707445,\n \"mc2_stderr\": 0.015385392751923936\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5588003157063931,\n \"acc_stderr\": 0.013954975072834738\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245449\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/falcon-1b-t-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-05-57.412781.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-05-57.412781.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-05-57.412781.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-05-57.412781.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-05-57.412781.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-05-57.412781.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-05-57.412781.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-05-57.412781.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["**/details_harness|winogrande|5_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T19-05-57.412781.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T19_05_57.412781", "path": ["results_2023-12-04T19-05-57.412781.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T19-05-57.412781.parquet"]}]}]} | 2023-12-04T19:08:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/falcon-1b-t-sft
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model KnutJaegersberg/falcon-1b-t-sft on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
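The code block that normally follows this sentence is missing from this copy of the card. A minimal sketch, assuming the repo follows the leaderboard's `details_<org>__<model>` naming convention (the exact repo id is inferred from that convention, not copied from this card):

```python
def load_details(config: str = "harness_winogrande_5"):
    """Fetch one evaluation configuration from the Hugging Face Hub."""
    from datasets import load_dataset  # pip install datasets

    # Repo id inferred from the leaderboard's "details_<org>__<model>" convention.
    return load_dataset(
        "open-llm-leaderboard/details_KnutJaegersberg__falcon-1b-t-sft",
        config,
        split="train",
    )
```

Calling `load_details()` downloads the requested split over the network, which is why the call is wrapped in a function rather than executed at import time here.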
## Latest results
These are the latest results from run 2023-12-04T19:05:57.412781 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of KnutJaegersberg/falcon-1b-t-sft",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/falcon-... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/falcon-1b-t-sft",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mode... | [
6,
24,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/falcon-1b-t-sft## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJae... |
65505e7ed6640e7dc8592e74715b23091e703ce0 |
# Dataset Card for Evaluation run of beomi/Yi-Ko-6B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beomi/Yi-Ko-6B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [beomi/Yi-Ko-6B](https://huggingface.co/beomi/Yi-Ko-6B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beomi__Yi-Ko-6B",
"harness_winogrande_5",
split="train")
```
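Once loaded (or read straight from the results JSON below), the aggregated metrics are plain floats. A small illustrative sketch that formats the top-level `"all"` block as mean ± stderr pairs, with the numbers copied from this run (the `summarize` helper is hypothetical, not part of the leaderboard tooling):

```python
# Values copied from the "all" block of results_2023-12-04T19-08-16.844680.json
all_metrics = {
    "acc": 0.5511585458954897,
    "acc_stderr": 0.03372606124688932,
    "acc_norm": 0.5592593520510756,
    "acc_norm_stderr": 0.034493739566800206,
}

def summarize(metrics):
    """Format each metric as "name: mean ± stderr" with 4 decimal places."""
    lines = []
    for name in ("acc", "acc_norm"):
        lines.append(f"{name}: {metrics[name]:.4f} ± {metrics[name + '_stderr']:.4f}")
    return lines

print("\n".join(summarize(all_metrics)))
```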
## Latest results
These are the [latest results from run 2023-12-04T19:08:16.844680](https://huggingface.co/datasets/open-llm-leaderboard/details_beomi__Yi-Ko-6B/blob/main/results_2023-12-04T19-08-16.844680.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5511585458954897,
"acc_stderr": 0.03372606124688932,
"acc_norm": 0.5592593520510756,
"acc_norm_stderr": 0.034493739566800206,
"mc1": 0.24479804161566707,
"mc1_stderr": 0.015051869486715014,
"mc2": 0.37094583310302975,
"mc2_stderr": 0.013820249952756731
},
"harness|arc:challenge|25": {
"acc": 0.454778156996587,
"acc_stderr": 0.014551507060836355,
"acc_norm": 0.48890784982935154,
"acc_norm_stderr": 0.01460779491401305
},
"harness|hellaswag|10": {
"acc": 0.5488946425014938,
"acc_stderr": 0.004965866098318173,
"acc_norm": 0.7447719577773352,
"acc_norm_stderr": 0.004350982826580606
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04316378599511324,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04316378599511324
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.39,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.02737987122994325,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.02737987122994325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117467,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.02493931390694079,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.02493931390694079
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.017818849564796648,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.017818849564796648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955924,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955924
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6134969325153374,
"acc_stderr": 0.03825825548848607,
"acc_norm": 0.6134969325153374,
"acc_norm_stderr": 0.03825825548848607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833587,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833587
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.027046857630716667,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.027046857630716667
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7177522349936143,
"acc_stderr": 0.016095302969878537,
"acc_norm": 0.7177522349936143,
"acc_norm_stderr": 0.016095302969878537
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016117,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016117
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249624,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249624
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.02807415894760065,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.02807415894760065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893934,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893934
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5864197530864198,
"acc_stderr": 0.027402042040269966,
"acc_norm": 0.5864197530864198,
"acc_norm_stderr": 0.027402042040269966
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41395045632333766,
"acc_stderr": 0.012579699631289265,
"acc_norm": 0.41395045632333766,
"acc_norm_stderr": 0.012579699631289265
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.030105636570016636,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.030105636570016636
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969768,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969768
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24479804161566707,
"mc1_stderr": 0.015051869486715014,
"mc2": 0.37094583310302975,
"mc2_stderr": 0.013820249952756731
},
"harness|winogrande|5": {
"acc": 0.7292817679558011,
"acc_stderr": 0.012487904760626304
},
"harness|gsm8k|5": {
"acc": 0.12509476876421532,
"acc_stderr": 0.009112601439849625
}
}
```
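The per-task entries above are plain JSON and can be post-processed with the standard library alone. A minimal sketch (using a hand-copied subset of the scores above; the task selection here is purely illustrative) that averages raw accuracy across a few tasks:

```python
from statistics import mean

# Subset of the results JSON above (copied by hand for illustration).
results = {
    "harness|arc:challenge|25": {"acc": 0.454778156996587},
    "harness|hellaswag|10": {"acc": 0.5488946425014938},
    "harness|winogrande|5": {"acc": 0.7292817679558011},
    "harness|gsm8k|5": {"acc": 0.12509476876421532},
}

# Average raw accuracy over the selected tasks.
avg_acc = mean(v["acc"] for v in results.values())
print(f"mean acc over {len(results)} tasks: {avg_acc:.4f}")
```

Note that this averages only the chosen subset; the `"all"` block at the top of the JSON is the aggregate computed by the harness itself over every task.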
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_beomi__Yi-Ko-6B | [
"region:us"
] | 2023-12-04T19:10:25+00:00 | {"pretty_name": "Evaluation run of beomi/Yi-Ko-6B", "dataset_summary": "Dataset automatically created during the evaluation run of model [beomi/Yi-Ko-6B](https://huggingface.co/beomi/Yi-Ko-6B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beomi__Yi-Ko-6B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T19:08:16.844680](https://huggingface.co/datasets/open-llm-leaderboard/details_beomi__Yi-Ko-6B/blob/main/results_2023-12-04T19-08-16.844680.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5511585458954897,\n \"acc_stderr\": 0.03372606124688932,\n \"acc_norm\": 0.5592593520510756,\n \"acc_norm_stderr\": 0.034493739566800206,\n \"mc1\": 0.24479804161566707,\n \"mc1_stderr\": 0.015051869486715014,\n \"mc2\": 0.37094583310302975,\n \"mc2_stderr\": 0.013820249952756731\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.454778156996587,\n \"acc_stderr\": 0.014551507060836355,\n \"acc_norm\": 0.48890784982935154,\n \"acc_norm_stderr\": 0.01460779491401305\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5488946425014938,\n \"acc_stderr\": 0.004965866098318173,\n \"acc_norm\": 0.7447719577773352,\n \"acc_norm_stderr\": 0.004350982826580606\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.04316378599511324,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.04316378599511324\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n 
\"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 
0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n \"acc_stderr\": 0.02737987122994325,\n \"acc_norm\": 0.635483870967742,\n \"acc_norm_stderr\": 0.02737987122994325\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\": 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117467,\n \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117467\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.02493931390694079,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.02493931390694079\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7779816513761468,\n \"acc_stderr\": 0.017818849564796648,\n \"acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.017818849564796648\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.031493281045079556,\n \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.031493281045079556\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955924,\n \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955924\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.045879047413018105,\n 
\"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848607,\n \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848607\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833587,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833587\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n \"acc_stderr\": 0.027046857630716667,\n \"acc_norm\": 0.782051282051282,\n \"acc_norm_stderr\": 0.027046857630716667\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7177522349936143,\n \"acc_stderr\": 0.016095302969878537,\n \"acc_norm\": 0.7177522349936143,\n \"acc_norm_stderr\": 0.016095302969878537\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016117,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016117\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249624,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249624\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.02807415894760065,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.02807415894760065\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n \"acc_stderr\": 0.026795422327893934,\n 
\"acc_norm\": 0.6655948553054662,\n \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5864197530864198,\n \"acc_stderr\": 0.027402042040269966,\n \"acc_norm\": 0.5864197530864198,\n \"acc_norm_stderr\": 0.027402042040269966\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41395045632333766,\n \"acc_stderr\": 0.012579699631289265,\n \"acc_norm\": 0.41395045632333766,\n \"acc_norm_stderr\": 0.012579699631289265\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.030105636570016636,\n \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.030105636570016636\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969768,\n \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969768\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278985,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278985\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.7611940298507462,\n \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n 
\"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24479804161566707,\n \"mc1_stderr\": 0.015051869486715014,\n \"mc2\": 0.37094583310302975,\n \"mc2_stderr\": 0.013820249952756731\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7292817679558011,\n \"acc_stderr\": 0.012487904760626304\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12509476876421532,\n \"acc_stderr\": 0.009112601439849625\n }\n}\n```", "repo_url": "https://huggingface.co/beomi/Yi-Ko-6B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-08-16.844680.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-08-16.844680.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-08-16.844680.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-08-16.844680.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-08-16.844680.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-08-16.844680.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-08-16.844680.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": 
"2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": 
"2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-08-16.844680.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["**/details_harness|winogrande|5_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T19-08-16.844680.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T19_08_16.844680", "path": ["results_2023-12-04T19-08-16.844680.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T19-08-16.844680.parquet"]}]}]} | 2023-12-04T19:11:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of beomi/Yi-Ko-6B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model beomi/Yi-Ko-6B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
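A minimal sketch following the template used by sibling leaderboard cards; the repo id here is derived from the model id via the leaderboard's `details_<org>__<model>` naming pattern, which is an assumption inferred from this dump rather than something stated in this card:

```python
# Repo id follows the leaderboard's "details_<org>__<model>" pattern
# (an assumption inferred from sibling cards in this dump).
model_id = "beomi/Yi-Ko-6B"
repo_id = "open-llm-leaderboard/details_" + model_id.replace("/", "__")
print(repo_id)  # open-llm-leaderboard/details_beomi__Yi-Ko-6B

# Loading a configuration requires the `datasets` library and network access:
# from datasets import load_dataset
# data = load_dataset(repo_id, "harness_winogrande_5", split="train")
```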
## Latest results
These are the latest results from run 2023-12-04T19:08:16.844680 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of beomi/Yi-Ko-6B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model beomi/Yi-Ko-6B on the Open LLM Leaderboa... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of beomi/Yi-Ko-6B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model beomi/Yi-Ko-6B ... | [
6,
18,
31,
167,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of beomi/Yi-Ko-6B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model beomi/Yi-Ko-6B on the Op... |
412dcf914d90535fa6c07198eb76c5b1bc7dc026 | # Dataset Card for "twitter-year-splits"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | KaiNylund/twitter-year-splits | [
"license:cc0-1.0",
"region:us"
] | 2023-12-04T19:12:03+00:00 | {"license": "cc0-1.0", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "2015_train", "num_bytes": 208681939, "num_examples": 2170480}, {"name": "2015_test", "num_bytes": 10434355, "num_examples": 108579}, {"name": "2016_train", "num_bytes": 208368447, "num_examples": 2092079}, {"name": "2016_test", "num_bytes": 10418366, "num_examples": 104573}, {"name": "2017_train", "num_bytes": 208041364, "num_examples": 2010333}, {"name": "2017_test", "num_bytes": 10402836, "num_examples": 100694}, {"name": "2018_train", "num_bytes": 207412650, "num_examples": 1853142}, {"name": "2018_test", "num_bytes": 10371011, "num_examples": 92724}, {"name": "2019_train", "num_bytes": 207727161, "num_examples": 1931761}, {"name": "2019_test", "num_bytes": 10386587, "num_examples": 96626}, {"name": "2020_train", "num_bytes": 207828470, "num_examples": 1957103}, {"name": "2020_test", "num_bytes": 10391406, "num_examples": 97842}], "download_size": 1021891477, "dataset_size": 1310464592}} | 2024-02-12T23:26:50+00:00 | [] | [] | TAGS
#license-cc0-1.0 #region-us
| # Dataset Card for "twitter-year-splits"
More Information needed | [
"# Dataset Card for \"twitter-year-splits\"\n\nMore Information needed"
] | [
"TAGS\n#license-cc0-1.0 #region-us \n",
"# Dataset Card for \"twitter-year-splits\"\n\nMore Information needed"
] | [
14,
17
] | [
"passage: TAGS\n#license-cc0-1.0 #region-us \n# Dataset Card for \"twitter-year-splits\"\n\nMore Information needed"
] |
0dc6090bb7c094496fca17f3e0be013b19b0b8bc | # Dataset Card for "arxiv-year-splits"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | KaiNylund/arxiv-year-splits | [
"region:us"
] | 2023-12-04T19:16:02+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "2006_2008_train", "num_bytes": 100484371, "num_examples": 120937}, {"name": "2006_2008_test", "num_bytes": 10050474, "num_examples": 12157}, {"name": "2009_2011_train", "num_bytes": 145839572, "num_examples": 157401}, {"name": "2009_2011_test", "num_bytes": 15067693, "num_examples": 16306}, {"name": "2012_2014_train", "num_bytes": 149239610, "num_examples": 153162}, {"name": "2012_2014_test", "num_bytes": 15064105, "num_examples": 15440}, {"name": "2015_2017_train", "num_bytes": 150547411, "num_examples": 136762}, {"name": "2015_2017_test", "num_bytes": 15057851, "num_examples": 13745}, {"name": "2018_2020_train", "num_bytes": 150517629, "num_examples": 129279}, {"name": "2018_2020_test", "num_bytes": 15052957, "num_examples": 12885}], "download_size": 474674602, "dataset_size": 766921673}} | 2023-12-04T19:17:02+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "arxiv-year-splits"
More Information needed | [
"# Dataset Card for \"arxiv-year-splits\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"arxiv-year-splits\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"arxiv-year-splits\"\n\nMore Information needed"
] |
6752e503bb5e56e4c05a17e02327caf3d67857ca |
# Dataset Card for Evaluation run of rishiraj/smol-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/rishiraj/smol-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [rishiraj/smol-7b](https://huggingface.co/rishiraj/smol-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rishiraj__smol-7b",
"harness_winogrande_5",
split="train")
```
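
Because each run adds a split named after its timestamp, the split name itself can be parsed back into a `datetime` when you need to order or filter runs. A minimal sketch — the example name is the timestamped split from this card's configs, and the format string is an assumption inferred from that naming:

```python
from datetime import datetime

# Timestamped split name as it appears in this dataset's configs;
# "_" stands in for "-" and ":" so the name stays filesystem-friendly.
split_name = "2023_12_04T19_19_39.463418"

# Assumed format based on the naming above; %f captures the microseconds.
run_time = datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")
print(run_time.isoformat())  # -> 2023-12-04T19:19:39.463418
```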
## Latest results
These are the [latest results from run 2023-12-04T19:19:39.463418](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__smol-7b/blob/main/results_2023-12-04T19-19-39.463418.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6514323841472758,
"acc_stderr": 0.03191453823895794,
"acc_norm": 0.6531744958038254,
"acc_norm_stderr": 0.032557792933231744,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.4617167162027618,
"mc2_stderr": 0.015041171351243195
},
"harness|arc:challenge|25": {
"acc": 0.5981228668941979,
"acc_stderr": 0.014327268614578276,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.014049106564955009
},
"harness|hellaswag|10": {
"acc": 0.657239593706433,
"acc_stderr": 0.004736621698861175,
"acc_norm": 0.8477394941246763,
"acc_norm_stderr": 0.0035853896364723818
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188723,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188723
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113114,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126255,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126255
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503234,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503234
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.01653682964899712,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.01653682964899712
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015064,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015064
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.4617167162027618,
"mc2_stderr": 0.015041171351243195
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.011099796645920524
},
"harness|gsm8k|5": {
"acc": 0.623199393479909,
"acc_stderr": 0.013347858757829158
}
}
```
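
Each per-task entry above has the same shape, so a few lines are enough to reduce them to a single score. The snippet below is a minimal sketch over a hand-copied subset of the metrics shown; preferring `acc_norm` when a task reports one is an illustrative choice, not the leaderboard's official aggregation:

```python
# Hand-copied subset of the per-task metrics from the JSON above.
results = {
    "harness|arc:challenge|25": {"acc": 0.5981228668941979, "acc_norm": 0.6373720136518771},
    "harness|hellaswag|10": {"acc": 0.657239593706433, "acc_norm": 0.8477394941246763},
    "harness|winogrande|5": {"acc": 0.8066298342541437},
    "harness|gsm8k|5": {"acc": 0.623199393479909},
}

def task_score(metrics: dict) -> float:
    # Prefer normalized accuracy when reported, fall back to raw accuracy.
    return metrics.get("acc_norm", metrics["acc"])

average = sum(task_score(m) for m in results.values()) / len(results)
print(round(average, 4))  # -> 0.7287
```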
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_rishiraj__smol-7b | [
"region:us"
] | 2023-12-04T19:22:31+00:00 | {"pretty_name": "Evaluation run of rishiraj/smol-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [rishiraj/smol-7b](https://huggingface.co/rishiraj/smol-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rishiraj__smol-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T19:19:39.463418](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__smol-7b/blob/main/results_2023-12-04T19-19-39.463418.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6514323841472758,\n \"acc_stderr\": 0.03191453823895794,\n \"acc_norm\": 0.6531744958038254,\n \"acc_norm_stderr\": 0.032557792933231744,\n \"mc1\": 0.30599755201958384,\n \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.4617167162027618,\n \"mc2_stderr\": 0.015041171351243195\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5981228668941979,\n \"acc_stderr\": 0.014327268614578276,\n \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955009\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.657239593706433,\n \"acc_stderr\": 0.004736621698861175,\n \"acc_norm\": 0.8477394941246763,\n \"acc_norm_stderr\": 0.0035853896364723818\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 
0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126255,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126255\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503234,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503234\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n 
\"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n \"acc_stderr\": 0.01653682964899712,\n \"acc_norm\": 0.42569832402234636,\n \"acc_norm_stderr\": 0.01653682964899712\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341063,\n 
\"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015064,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015064\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 
0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.4617167162027618,\n \"mc2_stderr\": 0.015041171351243195\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920524\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.623199393479909,\n \"acc_stderr\": 0.013347858757829158\n }\n}\n```", "repo_url": "https://huggingface.co/rishiraj/smol-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-19-39.463418.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-19-39.463418.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-19-39.463418.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-19-39.463418.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-19-39.463418.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-19-39.463418.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-19-39.463418.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": 
"2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": 
"2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-19-39.463418.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["**/details_harness|winogrande|5_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T19-19-39.463418.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T19_19_39.463418", "path": ["results_2023-12-04T19-19-39.463418.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T19-19-39.463418.parquet"]}]}]} | 2023-12-04T19:23:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of rishiraj/smol-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model rishiraj/smol-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance use the `datasets` library's `load_dataset` function with one of the configurations listed above.
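A minimal sketch of deriving the identifiers needed to load these details. The repository naming convention (`open-llm-leaderboard/details_<org>__<model>`) and the final `load_dataset` call shown in the closing comment are assumptions based on the usual Open LLM Leaderboard layout, not something stated in this card:

```python
# Derive the repository id and split name for one evaluation run.
# The naming convention below is an assumption (the usual
# open-llm-leaderboard layout), not taken from this card.

def details_identifiers(model_id: str, run_timestamp: str):
    # e.g. model_id = "rishiraj/smol-7b",
    #      run_timestamp = "2023-12-04T19:19:39.463418"
    repo = "open-llm-leaderboard/details_" + model_id.replace("/", "__")
    # Timestamped split names replace "-" and ":" with "_".
    split = run_timestamp.replace("-", "_").replace(":", "_")
    return repo, split

repo, split = details_identifiers("rishiraj/smol-7b", "2023-12-04T19:19:39.463418")
print(repo)   # open-llm-leaderboard/details_rishiraj__smol-7b
print(split)  # 2023_12_04T19_19_39.463418

# With the `datasets` library installed, one of the 63 configs could then be
# loaded with (hypothetical call, assuming the convention above):
# from datasets import load_dataset
# data = load_dataset(repo, "harness_winogrande_5", split=split)  # or split="latest"
```

If the convention holds, passing `split="latest"` instead of a timestamped split returns the most recent results for that configuration.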
## Latest results
These are the latest results from run 2023-12-04T19:19:39.463418 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of rishiraj/smol-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model rishiraj/smol-7b on the Open LLM Leade... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of rishiraj/smol-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model rishiraj/smol... | [
6,
17,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of rishiraj/smol-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model rishiraj/smol-7b on th... |
c24faf0be3c16f9d019c4e03bcbadc36b6b15521 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | ItsRedux/CyberSecDat | [
"region:us"
] | 2023-12-04T19:32:46+00:00 | {} | 2023-12-04T19:35:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:"... | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Languag... | [
6,
34,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP... |
d0e07b18f8a736ba9c866c190768462bb9048eea |
# Dataset Card for Evaluation run of meta-math/MetaMath-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/meta-math/MetaMath-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [meta-math/MetaMath-Mistral-7B](https://huggingface.co/meta-math/MetaMath-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_meta-math__MetaMath-Mistral-7B",
"harness_winogrande_5",
split="train")
```
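The per-run split names appear to be derived from the run timestamp by replacing `-` and `:` with `_` (an assumption based on the split names listed in this card's metadata, e.g. `2023_12_04T19_35_59.251082`). A minimal sketch of that mapping:

```python
def run_split_name(timestamp: str) -> str:
    """Map a run timestamp to its split name, e.g.
    "2023-12-04T19:35:59.251082" -> "2023_12_04T19_35_59.251082".
    """
    return timestamp.replace("-", "_").replace(":", "_")

# The "latest" split can be used instead of a specific timestamp.
print(run_split_name("2023-12-04T19:35:59.251082"))
```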
## Latest results
These are the [latest results from run 2023-12-04T19:35:59.251082](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-Mistral-7B/blob/main/results_2023-12-04T19-35-59.251082.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6224817411296446,
"acc_stderr": 0.03262551509185562,
"acc_norm": 0.6227799225969178,
"acc_norm_stderr": 0.033291016555049055,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4489052122445318,
"mc2_stderr": 0.01547532303838066
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.01446763155913799,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693024
},
"harness|hellaswag|10": {
"acc": 0.6437960565624378,
"acc_stderr": 0.004778978031389641,
"acc_norm": 0.8258315076677952,
"acc_norm_stderr": 0.0037847921724660652
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.0252798503974049,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.0252798503974049
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36089385474860336,
"acc_stderr": 0.01606229067111046,
"acc_norm": 0.36089385474860336,
"acc_norm_stderr": 0.01606229067111046
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4491525423728814,
"acc_stderr": 0.012704030518851488,
"acc_norm": 0.4491525423728814,
"acc_norm_stderr": 0.012704030518851488
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578655,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578655
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368043
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4489052122445318,
"mc2_stderr": 0.01547532303838066
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174787
},
"harness|gsm8k|5": {
"acc": 0.6884003032600455,
"acc_stderr": 0.012757375376754941
}
}
```
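The task keys in the payload above follow a `harness|<task>|<n_shot>` naming scheme, with the leaderboard reporting `acc_norm` where available and `acc` otherwise. A minimal sketch of summarizing such a payload (the inline dict below is a small excerpt of the results shown above, not the full file):

```python
# Excerpt of the results payload above, keyed by "harness|<task>|<n_shot>".
results = {
    "harness|arc:challenge|25": {"acc": 0.5699658703071673, "acc_norm": 0.606655290102389},
    "harness|hellaswag|10": {"acc": 0.6437960565624378, "acc_norm": 0.8258315076677952},
    "harness|winogrande|5": {"acc": 0.7576953433307024},
    "harness|gsm8k|5": {"acc": 0.6884003032600455},
}

def metric(task_results: dict) -> float:
    # Prefer normalized accuracy when present, falling back to raw accuracy.
    return task_results.get("acc_norm", task_results.get("acc"))

for task, vals in results.items():
    name = task.split("|")[1]
    print(f"{name}: {metric(vals):.4f}")
```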
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_meta-math__MetaMath-Mistral-7B | [
"region:us"
] | 2023-12-04T19:38:52+00:00 | {"pretty_name": "Evaluation run of meta-math/MetaMath-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [meta-math/MetaMath-Mistral-7B](https://huggingface.co/meta-math/MetaMath-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_meta-math__MetaMath-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T19:35:59.251082](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-Mistral-7B/blob/main/results_2023-12-04T19-35-59.251082.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6224817411296446,\n \"acc_stderr\": 0.03262551509185562,\n \"acc_norm\": 0.6227799225969178,\n \"acc_norm_stderr\": 0.033291016555049055,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4489052122445318,\n \"mc2_stderr\": 0.01547532303838066\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.01446763155913799,\n \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693024\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6437960565624378,\n \"acc_stderr\": 0.004778978031389641,\n \"acc_norm\": 0.8258315076677952,\n \"acc_norm_stderr\": 0.0037847921724660652\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.0252798503974049,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.0252798503974049\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 
0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n 
\"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 
0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36089385474860336,\n \"acc_stderr\": 0.01606229067111046,\n \"acc_norm\": 0.36089385474860336,\n \"acc_norm_stderr\": 0.01606229067111046\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824775,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824775\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 
0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n \"acc_stderr\": 0.012704030518851488,\n \"acc_norm\": 0.4491525423728814,\n \"acc_norm_stderr\": 0.012704030518851488\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578655,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578655\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n 
\"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368043,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368043\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4489052122445318,\n \"mc2_stderr\": 0.01547532303838066\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174787\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6884003032600455,\n \"acc_stderr\": 0.012757375376754941\n }\n}\n```", "repo_url": "https://huggingface.co/meta-math/MetaMath-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-35-59.251082.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-35-59.251082.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-35-59.251082.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-35-59.251082.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-35-59.251082.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-35-59.251082.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T19-35-59.251082.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-35-59.251082.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["**/details_harness|winogrande|5_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T19-35-59.251082.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T19_35_59.251082", "path": ["results_2023-12-04T19-35-59.251082.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T19-35-59.251082.parquet"]}]}]} | 2023-12-04T19:39:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of meta-math/MetaMath-Mistral-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model meta-math/MetaMath-Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
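The loading snippet itself did not survive extraction; below is a minimal Python sketch, assuming the Open LLM Leaderboard convention that a model's details repo is named `details_` plus the model id with `/` replaced by `__` (the resulting repo id and the `harness_winogrande_5` config name are illustrative, not confirmed by this card):

```python
# A minimal sketch, assuming the Open LLM Leaderboard's naming convention:
# per-model details live in a repo named "details_" + the model id with
# "/" replaced by "__", and each evaluated task is a separate config.
def details_repo_id(model_id: str) -> str:
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

repo = details_repo_id("meta-math/MetaMath-Mistral-7B")
print(repo)  # open-llm-leaderboard/details_meta-math__MetaMath-Mistral-7B

# Downloading an actual split requires network access, e.g.:
#   from datasets import load_dataset
#   data = load_dataset(repo, "harness_winogrande_5", split="latest")
```

The `split` argument accepts either `"latest"` or a timestamped split such as `"2023_12_04T19_35_59.251082"`, matching the splits listed in this card's configs.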
## Latest results
These are the latest results from run 2023-12-04T19:35:59.251082 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of meta-math/MetaMath-Mistral-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model meta-math/MetaMath-Mistra... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of meta-math/MetaMath-Mistral-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ... | [
6,
21,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of meta-math/MetaMath-Mistral-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model meta-math... |
2b971d9ea2f1075c245cbab045dc44a1e1570bf2 | # Dataset Card for "fashion-mnist-interview"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | daloopa/fashion-mnist-interview | [
"region:us"
] | 2023-12-04T19:52:02+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "T - shirt / top", "1": "Trouser", "2": "Pullover", "3": "Dress", "4": "Coat", "5": "Sandal", "6": "Shirt", "7": "Sneaker", "8": "Bag", "9": "Ankle boot"}}}}], "splits": [{"name": "train", "num_bytes": 31049107.0, "num_examples": 60000}, {"name": "test", "num_bytes": 4150316.0, "num_examples": 8000}], "download_size": 33099036, "dataset_size": 35199423.0}} | 2023-12-04T19:52:08+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "fashion-mnist-interview"
More Information needed | [
"# Dataset Card for \"fashion-mnist-interview\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"fashion-mnist-interview\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"fashion-mnist-interview\"\n\nMore Information needed"
] |
086d77b439d29d4eb817065b5b1d3d2ef2086db9 | # Dataset Card for "fashion-mnist-interview-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | daloopa/fashion-mnist-interview-test | [
"region:us"
] | 2023-12-04T19:53:27+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}], "splits": [{"name": "test", "num_bytes": 1026244.0, "num_examples": 2000}], "download_size": 982702, "dataset_size": 1026244.0}} | 2023-12-04T21:53:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "fashion-mnist-interview-test"
More Information needed | [
"# Dataset Card for \"fashion-mnist-interview-test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"fashion-mnist-interview-test\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"fashion-mnist-interview-test\"\n\nMore Information needed"
] |
7788c0e0ee38bdd2ab211109da8a9c3117614866 |
# Dataset Card for Evaluation run of tlphams/zoyllm-7b-slimorca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/tlphams/zoyllm-7b-slimorca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [tlphams/zoyllm-7b-slimorca](https://huggingface.co/tlphams/zoyllm-7b-slimorca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
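Because split names embed the run timestamp in a fixed-width format, plain string sorting recovers chronological order. A minimal sketch (the second split name is taken verbatim from this card; the first is a hypothetical earlier run added for illustration):

```python
# Timestamp-named splits ("YYYY_MM_DDTHH_MM_SS.ffffff") sort
# lexicographically in the same order as chronologically.
splits = [
    "2023_11_30T10_00_00.000000",  # hypothetical earlier run
    "2023_12_04T20_19_06.813924",  # the run shown in this card
]
newest = max(splits)  # string max == most recent run
print(newest)  # → 2023_12_04T20_19_06.813924
```

This is why the "latest" split can simply track the lexicographically greatest timestamp.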
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T20:19:06.813924](https://huggingface.co/datasets/open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca/blob/main/results_2023-12-04T20-19-06.813924.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4870010988384523,
"acc_stderr": 0.03455949361884823,
"acc_norm": 0.4920391497879656,
"acc_norm_stderr": 0.03531050289249056,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454604,
"mc2": 0.4913166366572656,
"mc2_stderr": 0.0160517163595852
},
"harness|arc:challenge|25": {
"acc": 0.4726962457337884,
"acc_stderr": 0.014589589101985993,
"acc_norm": 0.5059726962457338,
"acc_norm_stderr": 0.014610348300255795
},
"harness|hellaswag|10": {
"acc": 0.5509858593905597,
"acc_stderr": 0.004963771168672079,
"acc_norm": 0.7211710814578769,
"acc_norm_stderr": 0.004475067344626756
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159788,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.03308530426228257,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.03308530426228257
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6414141414141414,
"acc_stderr": 0.034169036403915214,
"acc_norm": 0.6414141414141414,
"acc_norm_stderr": 0.034169036403915214
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361356,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361356
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.025049197876042338,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.025049197876042338
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507382,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507382
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.671559633027523,
"acc_stderr": 0.02013590279729841,
"acc_norm": 0.671559633027523,
"acc_norm_stderr": 0.02013590279729841
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.03266478331527272,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.03266478331527272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.03402272044340703,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.03402272044340703
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.030381931949990407,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.030381931949990407
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199984,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.03881891213334384,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.03881891213334384
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809447,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.648786717752235,
"acc_stderr": 0.017069982051499434,
"acc_norm": 0.648786717752235,
"acc_norm_stderr": 0.017069982051499434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.02684298551961537,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.02684298551961537
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095266,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095266
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5337620578778135,
"acc_stderr": 0.028333277109562793,
"acc_norm": 0.5337620578778135,
"acc_norm_stderr": 0.028333277109562793
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.027756535257347666,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.027756535257347666
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35984354628422427,
"acc_stderr": 0.0122582604836898,
"acc_norm": 0.35984354628422427,
"acc_norm_stderr": 0.0122582604836898
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016633,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.020148939420415738,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.020148939420415738
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.03136250240935893,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03136250240935893
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979035,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979035
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6374269005847953,
"acc_stderr": 0.036871306155620606,
"acc_norm": 0.6374269005847953,
"acc_norm_stderr": 0.036871306155620606
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454604,
"mc2": 0.4913166366572656,
"mc2_stderr": 0.0160517163595852
},
"harness|winogrande|5": {
"acc": 0.6732438831886346,
"acc_stderr": 0.013181997302131362
},
"harness|gsm8k|5": {
"acc": 0.20697498104624715,
"acc_stderr": 0.011159498164891766
}
}
```
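The per-task accuracies above can be aggregated by hand into a subject-group average. A minimal sketch (the three values are copied verbatim from the JSON above; the unweighted mean is an illustrative choice, not necessarily the leaderboard's exact aggregation):

```python
# Unweighted mean over a few of the MMLU-style task accuracies listed above.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.3,
    "harness|hendrycksTest-anatomy|5": 0.4444444444444444,
    "harness|hendrycksTest-astronomy|5": 0.45394736842105265,
}
subset_avg = sum(task_acc.values()) / len(task_acc)
print(round(subset_avg, 4))  # → 0.3995
```

Extending the dictionary to all 57 `hendrycksTest` entries reproduces a full MMLU-style average for this run.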
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca | [
"region:us"
] | 2023-12-04T20:21:57+00:00 | {"pretty_name": "Evaluation run of tlphams/zoyllm-7b-slimorca", "dataset_summary": "Dataset automatically created during the evaluation run of model [tlphams/zoyllm-7b-slimorca](https://huggingface.co/tlphams/zoyllm-7b-slimorca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T20:19:06.813924](https://huggingface.co/datasets/open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca/blob/main/results_2023-12-04T20-19-06.813924.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4870010988384523,\n \"acc_stderr\": 0.03455949361884823,\n \"acc_norm\": 0.4920391497879656,\n \"acc_norm_stderr\": 0.03531050289249056,\n \"mc1\": 0.32313341493268055,\n \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.4913166366572656,\n \"mc2_stderr\": 0.0160517163595852\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4726962457337884,\n \"acc_stderr\": 0.014589589101985993,\n \"acc_norm\": 0.5059726962457338,\n \"acc_norm_stderr\": 0.014610348300255795\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5509858593905597,\n \"acc_stderr\": 0.004963771168672079,\n \"acc_norm\": 0.7211710814578769,\n \"acc_norm_stderr\": 0.004475067344626756\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874142,\n \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874142\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458,\n \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n 
\"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159788,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 
0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.03308530426228257,\n \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.03308530426228257\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6414141414141414,\n \"acc_stderr\": 0.034169036403915214,\n \"acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.034169036403915214\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361356,\n \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361356\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.025049197876042338,\n \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.025049197876042338\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507382,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507382\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.671559633027523,\n \"acc_stderr\": 0.02013590279729841,\n \"acc_norm\": 0.671559633027523,\n \"acc_norm_stderr\": 0.02013590279729841\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35648148148148145,\n \"acc_stderr\": 0.03266478331527272,\n \"acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.03266478331527272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6225490196078431,\n \"acc_stderr\": 0.03402272044340703,\n \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.03402272044340703\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.679324894514768,\n \"acc_stderr\": 0.030381931949990407,\n \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.030381931949990407\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550988,\n \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550988\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04750077341199984,\n 
\"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04750077341199984\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334384,\n \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334384\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161551,\n \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161551\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02934311479809447,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02934311479809447\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.648786717752235,\n \"acc_stderr\": 0.017069982051499434,\n \"acc_norm\": 0.648786717752235,\n \"acc_norm_stderr\": 0.017069982051499434\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.02684298551961537,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.02684298551961537\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n \"acc_stderr\": 0.014874252168095266,\n \"acc_norm\": 0.27150837988826815,\n \"acc_norm_stderr\": 0.014874252168095266\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5337620578778135,\n \"acc_stderr\": 0.028333277109562793,\n 
\"acc_norm\": 0.5337620578778135,\n \"acc_norm_stderr\": 0.028333277109562793\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.027756535257347666,\n \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.027756535257347666\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35984354628422427,\n \"acc_stderr\": 0.0122582604836898,\n \"acc_norm\": 0.35984354628422427,\n \"acc_norm_stderr\": 0.0122582604836898\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016633,\n \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016633\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.020148939420415738,\n \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.020148939420415738\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935893,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n \"acc_stderr\": 0.03307615947979035,\n \"acc_norm\": 0.6766169154228856,\n \"acc_norm_stderr\": 0.03307615947979035\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n \"acc_stderr\": 0.03799857454479636,\n 
\"acc_norm\": 0.39156626506024095,\n \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6374269005847953,\n \"acc_stderr\": 0.036871306155620606,\n \"acc_norm\": 0.6374269005847953,\n \"acc_norm_stderr\": 0.036871306155620606\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.4913166366572656,\n \"mc2_stderr\": 0.0160517163595852\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6732438831886346,\n \"acc_stderr\": 0.013181997302131362\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20697498104624715,\n \"acc_stderr\": 0.011159498164891766\n }\n}\n```", "repo_url": "https://huggingface.co/tlphams/zoyllm-7b-slimorca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|arc:challenge|25_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|gsm8k|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hellaswag|10_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-19-06.813924.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-19-06.813924.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-19-06.813924.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-19-06.813924.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-19-06.813924.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-19-06.813924.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T20-19-06.813924.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T20-19-06.813924.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["**/details_harness|winogrande|5_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T20-19-06.813924.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T20_19_06.813924", "path": ["results_2023-12-04T20-19-06.813924.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T20-19-06.813924.parquet"]}]}]} | 2023-12-04T20:22:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of tlphams/zoyllm-7b-slimorca
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model tlphams/zoyllm-7b-slimorca on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
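For instance, with the `datasets` library, one task's details can be fetched as below. Note the repository name is an assumption based on the leaderboard's usual `details_<org>__<model>` convention, and `harness_winogrande_5` is one of the config names listed in this card's metadata — verify both before use:

```python
ORG, MODEL = "tlphams", "zoyllm-7b-slimorca"
# Assumed repo naming convention for leaderboard detail dumps (verify it):
REPO = f"open-llm-leaderboard/details_{ORG}__{MODEL}"

def load_task_details(config: str = "harness_winogrande_5"):
    """Return the per-example details for one task; the 'latest' split
    points at the newest run, timestamped splits hold specific runs."""
    from datasets import load_dataset  # requires `pip install datasets`
    return load_dataset(REPO, config, split="latest")
```

Replacing `split="latest"` with a timestamped split name (e.g. `2023_12_04T20_19_06.813924`) selects one specific run instead of the most recent one.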
## Latest results
These are the latest results from run 2023-12-04T20:19:06.813924 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
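As a sketch of working with those numbers: the results JSON is a flat mapping from task key to a dict of metrics, so individual values can be pulled out directly. The two entries below are quoted from this card's metadata:

```python
# Excerpt of the run's results, copied from the metadata above.
results = {
    "harness|winogrande|5": {"acc": 0.6732438831886346,
                             "acc_stderr": 0.013181997302131362},
    "harness|gsm8k|5": {"acc": 0.20697498104624715,
                        "acc_stderr": 0.011159498164891766},
}

def metric(results: dict, task: str, name: str = "acc") -> float:
    """Look up one metric (e.g. 'acc') for one task key."""
    return results[task][name]
```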
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
# Dataset Card for "thestack_python_threading"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | celinelee/thestack_python_threading | [
"region:us"
] | 2023-12-04T20:29:38+00:00 | {"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "python", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3869477993.4417205, "num_examples": 168964}, {"name": "valid", "num_bytes": 483696199.7791398, "num_examples": 21121}, {"name": "test", "num_bytes": 483696199.7791398, "num_examples": 21121}], "download_size": 1685764929, "dataset_size": 4836870393.0}} | 2023-12-04T20:44:03+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "thestack_python_threading"
More Information needed
# Dataset Card for Evaluation run of jebcarter/psyonic-cetacean-20B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jebcarter/psyonic-cetacean-20B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jebcarter/psyonic-cetacean-20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T20:41:51.584700](https://huggingface.co/datasets/open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B/blob/main/results_2023-12-04T20-41-51.584700.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5935200283760108,
"acc_stderr": 0.03289023551450696,
"acc_norm": 0.6017961208576313,
"acc_norm_stderr": 0.03361696714318325,
"mc1": 0.397796817625459,
"mc1_stderr": 0.01713393424855964,
"mc2": 0.5754737295645932,
"mc2_stderr": 0.01561942525764945
},
"harness|arc:challenge|25": {
"acc": 0.5895904436860068,
"acc_stderr": 0.014374922192642664,
"acc_norm": 0.6356655290102389,
"acc_norm_stderr": 0.014063260279882419
},
"harness|hellaswag|10": {
"acc": 0.6783509261103365,
"acc_stderr": 0.0046615449915830345,
"acc_norm": 0.861979685321649,
"acc_norm_stderr": 0.0034421638433628794
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.029300101705549655,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.029300101705549655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336937,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336937
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.024796060602699968,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.024796060602699968
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110946,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787572,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787572
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.032596251184168284,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.032596251184168284
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077812,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077812
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7879948914431673,
"acc_stderr": 0.014616099385833688,
"acc_norm": 0.7879948914431673,
"acc_norm_stderr": 0.014616099385833688
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.02494679222527231,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.02494679222527231
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22346368715083798,
"acc_stderr": 0.013932068638579773,
"acc_norm": 0.22346368715083798,
"acc_norm_stderr": 0.013932068638579773
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.02736359328468497,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.02736359328468497
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889017,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889017
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573702,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573702
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.01713393424855964,
"mc2": 0.5754737295645932,
"mc2_stderr": 0.01561942525764945
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.01161619821577323
},
"harness|gsm8k|5": {
"acc": 0.1470811220621683,
"acc_stderr": 0.009756063660359868
}
}
```
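As a small illustration, the headline metrics can be picked out of a trimmed copy of the aggregate "all" block above directly:

```python
# Trimmed copy of the "all" block from the results above.
results = {
    "all": {
        "acc": 0.5935200283760108,
        "acc_norm": 0.6017961208576313,
        "mc2": 0.5754737295645932,
    }
}

# Round each aggregate metric to four decimal places for display.
headline = {name: round(value, 4) for name, value in results["all"].items()}
print(headline)  # {'acc': 0.5935, 'acc_norm': 0.6018, 'mc2': 0.5755}
```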
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B | [
"region:us"
] | 2023-12-04T20:44:46+00:00 | {"pretty_name": "Evaluation run of jebcarter/psyonic-cetacean-20B", "dataset_summary": "Dataset automatically created during the evaluation run of model [jebcarter/psyonic-cetacean-20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T20:41:51.584700](https://huggingface.co/datasets/open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B/blob/main/results_2023-12-04T20-41-51.584700.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5935200283760108,\n \"acc_stderr\": 0.03289023551450696,\n \"acc_norm\": 0.6017961208576313,\n \"acc_norm_stderr\": 0.03361696714318325,\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.01713393424855964,\n \"mc2\": 0.5754737295645932,\n \"mc2_stderr\": 0.01561942525764945\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.014374922192642664,\n \"acc_norm\": 0.6356655290102389,\n \"acc_norm_stderr\": 0.014063260279882419\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6783509261103365,\n \"acc_stderr\": 0.0046615449915830345,\n \"acc_norm\": 0.861979685321649,\n \"acc_norm_stderr\": 0.0034421638433628794\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.029300101705549655,\n \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.029300101705549655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n 
\"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336937,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336937\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699968,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699968\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 
0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110946,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110946\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787572,\n \"acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787572\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.032596251184168284,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.032596251184168284\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 
0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7879948914431673,\n \"acc_stderr\": 0.014616099385833688,\n \"acc_norm\": 0.7879948914431673,\n \"acc_norm_stderr\": 0.014616099385833688\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.02494679222527231,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.02494679222527231\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22346368715083798,\n \"acc_stderr\": 0.013932068638579773,\n \"acc_norm\": 0.22346368715083798,\n \"acc_norm_stderr\": 0.013932068638579773\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.02736359328468497,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.02736359328468497\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n 
\"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889017,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889017\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573702,\n \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573702\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 
0.4819277108433735,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.01713393424855964,\n \"mc2\": 0.5754737295645932,\n \"mc2_stderr\": 0.01561942525764945\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.01161619821577323\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1470811220621683,\n \"acc_stderr\": 0.009756063660359868\n }\n}\n```", "repo_url": "https://huggingface.co/jebcarter/psyonic-cetacean-20B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|arc:challenge|25_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|gsm8k|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hellaswag|10_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-41-51.584700.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-41-51.584700.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-41-51.584700.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-41-51.584700.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-41-51.584700.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-41-51.584700.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T20-41-51.584700.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T20-41-51.584700.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["**/details_harness|winogrande|5_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T20-41-51.584700.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T20_41_51.584700", "path": ["results_2023-12-04T20-41-51.584700.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T20-41-51.584700.parquet"]}]}]} | 2023-12-04T20:45:31+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jebcarter/psyonic-cetacean-20B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jebcarter/psyonic-cetacean-20B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
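As a minimal sketch, the per-task details can be loaded with the `datasets` library. The repository path, task name, and shot count below are assumptions following the leaderboard's usual `details_<org>__<model>` naming, so verify them on the Hub before use:

```python
import os

# Assumed repository path (leaderboard convention "details_<org>__<model>");
# this is a guess from the model name, not confirmed by the card.
REPO = "open-llm-leaderboard/details_jebcarter__psyonic-cetacean-20B"

def details_config(task: str, n_shot: int) -> str:
    # Build a config name such as "harness_winogrande_5",
    # matching the config names listed in this card.
    return f"harness_{task}_{n_shot}"

# Guarded behind an env var so the example can be read without network access.
if os.environ.get("RUN_HUB_EXAMPLE"):
    from datasets import load_dataset  # pip install datasets
    data = load_dataset(REPO, details_config("winogrande", 5), split="latest")
    print(data)
```

The `split="latest"` argument mirrors the "latest" split that each configuration in this dataset exposes.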
## Latest results
These are the latest results from run 2023-12-04T20:41:51.584700 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of jebcarter/psyonic-cetacean-20B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jebcarter/psyonic-cetace... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jebcarter/psyonic-cetacean-20B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model... | [
6,
22,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jebcarter/psyonic-cetacean-20B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jebcarte... |
e74aff31a09a57375e24fdcfe60f6616620b19ee |
I created these images using p5.js. I first rendered them as a GIF, uploaded it to Google Drive, and then extracted the individual frames
from the GIF.
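That frame-extraction step can be scripted; here is a small sketch using Pillow (assumed to be installed — `extract_frames` and the file-naming scheme are illustrative, not the exact tool used):

```python
from PIL import Image, ImageSequence  # Pillow, assumed installed

def extract_frames(gif_path: str, out_prefix: str) -> list[str]:
    """Save every frame of a GIF as a numbered PNG and return the paths."""
    paths = []
    with Image.open(gif_path) as gif:
        for i, frame in enumerate(ImageSequence.Iterator(gif)):
            path = f"{out_prefix}_{i:03d}.png"
            # GIF frames may be palette-based; convert to RGB before saving.
            frame.convert("RGB").save(path)
            paths.append(path)
    return paths
```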
The light background gears images were generated in p5.js using this [sketch](https://editor.p5js.org/kfahn/sketches/mJ4FOnPy5).
The dark background gears images were generated in p5.js using this [sketch](https://editor.p5js.org/kfahn/sketches/mJ4FOnPy5). | kfahn/3d_gears | [
"license:mit",
"region:us"
] | 2023-12-04T20:54:14+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "dark", "1": "light"}}}}], "splits": [{"name": "train", "num_bytes": 296821064, "num_examples": 4000}], "download_size": 264360785, "dataset_size": 296821064}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-04T21:03:23+00:00 | [] | [] | TAGS
#license-mit #region-us
|
I created these images using p5.js. I first rendered them as a GIF, uploaded it to Google Drive, and then extracted the individual frames
from the GIF.
The light background gears images were generated in URL using this sketch.
The dark background gears images were generated in URL using this sketch. | [] | [
"TAGS\n#license-mit #region-us \n"
] | [
11
] | [
"passage: TAGS\n#license-mit #region-us \n"
] |
520f49e689bd35047970db0708b3efb1039d2f7b |
# Dataset Card for Evaluation run of adamo1139/Yi-34B-AEZAKMI-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/adamo1139/Yi-34B-AEZAKMI-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [adamo1139/Yi-34B-AEZAKMI-v1](https://huggingface.co/adamo1139/Yi-34B-AEZAKMI-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T22:17:18.926595](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1/blob/main/results_2023-12-04T22-17-18.926595.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.733063777345197,
"acc_stderr": 0.02911576095445218,
"acc_norm": 0.7392718490739228,
"acc_norm_stderr": 0.029657906091365063,
"mc1": 0.401468788249694,
"mc1_stderr": 0.01716027390169365,
"mc2": 0.557340774150812,
"mc2_stderr": 0.015053849366752348
},
"harness|arc:challenge|25": {
"acc": 0.606655290102389,
"acc_stderr": 0.014275101465693024,
"acc_norm": 0.643344709897611,
"acc_norm_stderr": 0.01399805690262019
},
"harness|hellaswag|10": {
"acc": 0.6422027484564827,
"acc_stderr": 0.004783723798286501,
"acc_norm": 0.8430591515634336,
"acc_norm_stderr": 0.0036300159898964017
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.868421052631579,
"acc_stderr": 0.027508689533549912,
"acc_norm": 0.868421052631579,
"acc_norm_stderr": 0.027508689533549912
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7735849056603774,
"acc_stderr": 0.025757559893106737,
"acc_norm": 0.7735849056603774,
"acc_norm_stderr": 0.025757559893106737
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093278,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093278
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7404255319148936,
"acc_stderr": 0.028659179374292326,
"acc_norm": 0.7404255319148936,
"acc_norm_stderr": 0.028659179374292326
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6507936507936508,
"acc_stderr": 0.024552292209342658,
"acc_norm": 0.6507936507936508,
"acc_norm_stderr": 0.024552292209342658
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.01730838128103453,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.01730838128103453
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822032,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822032
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781657,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781657
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853113,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853113
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295136,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295136
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7769230769230769,
"acc_stderr": 0.02110773012724401,
"acc_norm": 0.7769230769230769,
"acc_norm_stderr": 0.02110773012724401
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.40370370370370373,
"acc_stderr": 0.029914812342227638,
"acc_norm": 0.40370370370370373,
"acc_norm_stderr": 0.029914812342227638
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8277310924369747,
"acc_stderr": 0.024528664971305424,
"acc_norm": 0.8277310924369747,
"acc_norm_stderr": 0.024528664971305424
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603396,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603396
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9119266055045872,
"acc_stderr": 0.012150743719481693,
"acc_norm": 0.9119266055045872,
"acc_norm_stderr": 0.012150743719481693
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6342592592592593,
"acc_stderr": 0.032847388576472056,
"acc_norm": 0.6342592592592593,
"acc_norm_stderr": 0.032847388576472056
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.0208711184555521,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.0208711184555521
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868837,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868837
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445795,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445795
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869621,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869621
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8466257668711656,
"acc_stderr": 0.0283116014414386,
"acc_norm": 0.8466257668711656,
"acc_norm_stderr": 0.0283116014414386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6517857142857143,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.6517857142857143,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881347,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881347
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.01789378490401854,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.01789378490401854
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.896551724137931,
"acc_stderr": 0.010890452544691499,
"acc_norm": 0.896551724137931,
"acc_norm_stderr": 0.010890452544691499
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8063583815028902,
"acc_stderr": 0.021274230317515557,
"acc_norm": 0.8063583815028902,
"acc_norm_stderr": 0.021274230317515557
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7027932960893855,
"acc_stderr": 0.0152853133536416,
"acc_norm": 0.7027932960893855,
"acc_norm_stderr": 0.0152853133536416
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.022140767512880945,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.022140767512880945
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.02118589361522516,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.02118589361522516
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5921985815602837,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.5921985815602837,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5814863102998696,
"acc_stderr": 0.012599505608336477,
"acc_norm": 0.5814863102998696,
"acc_norm_stderr": 0.012599505608336477
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.016729937565537558,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.016729937565537558
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.023420972069166344,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.023420972069166344
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.02116621630465939,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.02116621630465939
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355024,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355024
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.01716027390169365,
"mc2": 0.557340774150812,
"mc2_stderr": 0.015053849366752348
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.5291887793783169,
"acc_stderr": 0.013748996794921798
}
}
```
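The leaderboard's aggregate MMLU number is simply the mean over the `hendrycksTest-*` entries in a results dict shaped like the one above; a small sketch (the helper name is illustrative):

```python
def mmlu_average(results: dict, metric: str = "acc") -> float:
    """Average a metric over the hendrycksTest (MMLU) subtask entries."""
    vals = [v[metric] for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")]
    return sum(vals) / len(vals)

# Tiny illustration with two subtask entries taken from the table above;
# non-MMLU keys (e.g. truthfulqa) are ignored by the prefix filter.
sample = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6888888888888889},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.868421052631579},
    "harness|truthfulqa:mc|0": {"mc1": 0.401468788249694},
}
print(round(mmlu_average(sample), 4))
```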
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1 | [
"region:us"
] | 2023-12-04T22:20:07+00:00 | {"pretty_name": "Evaluation run of adamo1139/Yi-34B-AEZAKMI-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [adamo1139/Yi-34B-AEZAKMI-v1](https://huggingface.co/adamo1139/Yi-34B-AEZAKMI-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T22:17:18.926595](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1/blob/main/results_2023-12-04T22-17-18.926595.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.733063777345197,\n \"acc_stderr\": 0.02911576095445218,\n \"acc_norm\": 0.7392718490739228,\n \"acc_norm_stderr\": 0.029657906091365063,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.01716027390169365,\n \"mc2\": 0.557340774150812,\n \"mc2_stderr\": 0.015053849366752348\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.606655290102389,\n \"acc_stderr\": 0.014275101465693024,\n \"acc_norm\": 0.643344709897611,\n \"acc_norm_stderr\": 0.01399805690262019\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6422027484564827,\n \"acc_stderr\": 0.004783723798286501,\n \"acc_norm\": 0.8430591515634336,\n \"acc_norm_stderr\": 0.0036300159898964017\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549912,\n \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549912\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7735849056603774,\n \"acc_stderr\": 0.025757559893106737,\n \"acc_norm\": 0.7735849056603774,\n \"acc_norm_stderr\": 0.025757559893106737\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n \"acc_stderr\": 0.030635578972093278,\n \"acc_norm\": 0.8402777777777778,\n \"acc_norm_stderr\": 0.030635578972093278\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7404255319148936,\n \"acc_stderr\": 0.028659179374292326,\n \"acc_norm\": 0.7404255319148936,\n \"acc_norm_stderr\": 0.028659179374292326\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.03600105692727771,\n \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.03600105692727771\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6507936507936508,\n \"acc_stderr\": 0.024552292209342658,\n \"acc_norm\": 0.6507936507936508,\n \"acc_norm_stderr\": 0.024552292209342658\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 
0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.01730838128103453,\n \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.01730838128103453\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822032,\n \"acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822032\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781657,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781657\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853113,\n \"acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853113\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295136,\n \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295136\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7769230769230769,\n \"acc_stderr\": 0.02110773012724401,\n \"acc_norm\": 0.7769230769230769,\n \"acc_norm_stderr\": 0.02110773012724401\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.40370370370370373,\n \"acc_stderr\": 0.029914812342227638,\n \"acc_norm\": 0.40370370370370373,\n \"acc_norm_stderr\": 0.029914812342227638\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8277310924369747,\n \"acc_stderr\": 0.024528664971305424,\n \"acc_norm\": 0.8277310924369747,\n \"acc_norm_stderr\": 0.024528664971305424\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.423841059602649,\n \"acc_stderr\": 0.04034846678603396,\n \"acc_norm\": 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603396\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9119266055045872,\n \"acc_stderr\": 0.012150743719481693,\n \"acc_norm\": 0.9119266055045872,\n \"acc_norm_stderr\": 0.012150743719481693\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6342592592592593,\n \"acc_stderr\": 0.032847388576472056,\n \"acc_norm\": 0.6342592592592593,\n \"acc_norm_stderr\": 0.032847388576472056\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9019607843137255,\n \"acc_stderr\": 0.0208711184555521,\n \"acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.0208711184555521\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868837,\n \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868837\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445795,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445795\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.03343270062869621,\n 
\"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.03343270062869621\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8466257668711656,\n \"acc_stderr\": 0.0283116014414386,\n \"acc_norm\": 0.8466257668711656,\n \"acc_norm_stderr\": 0.0283116014414386\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6517857142857143,\n \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.6517857142857143,\n \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881347,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881347\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.01789378490401854,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.01789378490401854\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.896551724137931,\n \"acc_stderr\": 0.010890452544691499,\n \"acc_norm\": 0.896551724137931,\n \"acc_norm_stderr\": 0.010890452544691499\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8063583815028902,\n \"acc_stderr\": 0.021274230317515557,\n \"acc_norm\": 0.8063583815028902,\n \"acc_norm_stderr\": 0.021274230317515557\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7027932960893855,\n \"acc_stderr\": 0.0152853133536416,\n \"acc_norm\": 0.7027932960893855,\n \"acc_norm_stderr\": 0.0152853133536416\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.022140767512880945,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.022140767512880945\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 
0.7684887459807074,\n \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.02118589361522516,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.02118589361522516\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5921985815602837,\n \"acc_stderr\": 0.029316011776343555,\n \"acc_norm\": 0.5921985815602837,\n \"acc_norm_stderr\": 0.029316011776343555\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5814863102998696,\n \"acc_stderr\": 0.012599505608336477,\n \"acc_norm\": 0.5814863102998696,\n \"acc_norm_stderr\": 0.012599505608336477\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.016729937565537558,\n \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.016729937565537558\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.023420972069166344,\n \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.023420972069166344\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.02116621630465939,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.02116621630465939\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n 
\"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355024,\n \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355024\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.01716027390169365,\n \"mc2\": 0.557340774150812,\n \"mc2_stderr\": 0.015053849366752348\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5291887793783169,\n \"acc_stderr\": 0.013748996794921798\n }\n}\n```", "repo_url": "https://huggingface.co/adamo1139/Yi-34B-AEZAKMI-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|arc:challenge|25_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|gsm8k|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hellaswag|10_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-17-18.926595.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-17-18.926595.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-17-18.926595.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-17-18.926595.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-17-18.926595.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-17-18.926595.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T22-17-18.926595.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T22-17-18.926595.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["**/details_harness|winogrande|5_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T22-17-18.926595.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T22_17_18.926595", "path": ["results_2023-12-04T22-17-18.926595.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T22-17-18.926595.parquet"]}]}]} | 2023-12-04T22:20:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of adamo1139/Yi-34B-AEZAKMI-v1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model adamo1139/Yi-34B-AEZAKMI-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-04T22:17:18.926595(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of adamo1139/Yi-34B-AEZAKMI-v1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model adamo1139/Yi-34B-AEZAKMI-v1... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of adamo1139/Yi-34B-AEZAKMI-v1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ad... | [
6,
24,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of adamo1139/Yi-34B-AEZAKMI-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model adamo1139/Y... |
28f5bf3e2fa6637980a200943393cdb81a490d08 |
Another clone of openbmb/UltraFeedback, with all completions by 'bard', 'gpt-3.5-turbo', or 'gpt-4' removed prior to binarization.
The annotations are still written by GPT4, so this dataset is neither OpenAI-free nor commercially-available.
If you're looking for an open-source DPO dataset, you may want to try nvidia/HelpSteer for the time being. | monology/ultrafeedback-liberated | [
"license:apache-2.0",
"region:us"
] | 2023-12-04T22:41:37+00:00 | {"license": "apache-2.0"} | 2023-12-04T23:28:02+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
Another clone of openbmb/UltraFeedback, with all completions by 'bard', 'gpt-3.5-turbo', or 'gpt-4' removed prior to binarization.
The annotations are still written by GPT4, so this dataset is neither OpenAI-free nor commercially-available.
If you're looking for an open-source DPO dataset, you may want to try nvidia/HelpSteer for the time being. | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] | [
14
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n"
] |
4a6a98dfadc7fb79a5f1ee4cc3db8e3e5a9fec54 | # Dataset Card for "fm-updates-falcon-instruct-7b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | coastalcph/fm-updates-falcon-instruct-7b | [
"region:us"
] | 2023-12-04T22:48:56+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "query", "struct": [{"name": "label", "dtype": "string"}, {"name": "objects", "list": [{"name": "aliases", "sequence": "string"}, {"name": "label", "dtype": "string"}, {"name": "qid", "dtype": "string"}]}, {"name": "qid", "dtype": "string"}, {"name": "rel_id", "dtype": "string"}, {"name": "relation", "dtype": "string"}]}, {"name": "prediction", "struct": [{"name": "predictions", "list": [{"name": "answer", "dtype": "string"}, {"name": "first_token_probability", "dtype": "float64"}, {"name": "per_token_probability", "sequence": "float64"}, {"name": "perplexity", "dtype": "float64"}]}, {"name": "query", "dtype": "string"}]}, {"name": "f1", "dtype": "float64"}, {"name": "relation", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "original_answer", "dtype": "string"}, {"name": "updates", "sequence": "string"}], "splits": [{"name": "test", "num_bytes": 694312.6861702128, "num_examples": 1749}], "download_size": 383499, "dataset_size": 694312.6861702128}} | 2023-12-04T22:49:01+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "fm-updates-falcon-instruct-7b"
More Information needed | [
"# Dataset Card for \"fm-updates-falcon-instruct-7b\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"fm-updates-falcon-instruct-7b\"\n\nMore Information needed"
] | [
6,
23
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"fm-updates-falcon-instruct-7b\"\n\nMore Information needed"
] |
f79a7f415e15f851bf0169da9322640d6ae5bc6e |
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Q
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kyujinpy/PlatYi-34B-Q
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-Q](https://huggingface.co/kyujinpy/PlatYi-34B-Q) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Q",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T22:52:59.529862](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Q/blob/main/results_2023-12-04T22-52-59.529862.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7691265720086405,
"acc_stderr": 0.027746564509879067,
"acc_norm": 0.7760516739566624,
"acc_norm_stderr": 0.028247188531903868,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.5302838395915566,
"mc2_stderr": 0.014898239428144871
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042194,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.013752062419817825
},
"harness|hellaswag|10": {
"acc": 0.654052977494523,
"acc_stderr": 0.004747038768172525,
"acc_norm": 0.8514240191196972,
"acc_norm_stderr": 0.0035494312479073674
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930384,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775402,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775402
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.0339175032232166,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.0339175032232166
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7872340425531915,
"acc_stderr": 0.026754391348039766,
"acc_norm": 0.7872340425531915,
"acc_norm_stderr": 0.026754391348039766
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8206896551724138,
"acc_stderr": 0.03196766433373187,
"acc_norm": 0.8206896551724138,
"acc_norm_stderr": 0.03196766433373187
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7354497354497355,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.7354497354497355,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9096774193548387,
"acc_stderr": 0.01630657064448832,
"acc_norm": 0.9096774193548387,
"acc_norm_stderr": 0.01630657064448832
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6748768472906403,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.6748768472906403,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.01349265975129514,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.01349265975129514
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8,
"acc_stderr": 0.020280805062535726,
"acc_norm": 0.8,
"acc_norm_stderr": 0.020280805062535726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4703703703703704,
"acc_stderr": 0.03043196354793659,
"acc_norm": 0.4703703703703704,
"acc_norm_stderr": 0.03043196354793659
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.02186325849485212,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.02186325849485212
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5496688741721855,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.5496688741721855,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.011918819327334877,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.011918819327334877
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089674,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.919831223628692,
"acc_stderr": 0.017676679991891625,
"acc_norm": 0.919831223628692,
"acc_norm_stderr": 0.017676679991891625
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9256198347107438,
"acc_stderr": 0.023952688836676752,
"acc_norm": 0.9256198347107438,
"acc_norm_stderr": 0.023952688836676752
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8834355828220859,
"acc_stderr": 0.025212327210507108,
"acc_norm": 0.8834355828220859,
"acc_norm_stderr": 0.025212327210507108
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6517857142857143,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.6517857142857143,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.9223300970873787,
"acc_stderr": 0.02650144078476276,
"acc_norm": 0.9223300970873787,
"acc_norm_stderr": 0.02650144078476276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436186,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436186
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466143,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466143
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9195402298850575,
"acc_stderr": 0.009726831316141849,
"acc_norm": 0.9195402298850575,
"acc_norm_stderr": 0.009726831316141849
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442265,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442265
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7441340782122905,
"acc_stderr": 0.014593620923210739,
"acc_norm": 0.7441340782122905,
"acc_norm_stderr": 0.014593620923210739
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8562091503267973,
"acc_stderr": 0.020091188936043693,
"acc_norm": 0.8562091503267973,
"acc_norm_stderr": 0.020091188936043693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8392282958199357,
"acc_stderr": 0.020862388082391894,
"acc_norm": 0.8392282958199357,
"acc_norm_stderr": 0.020862388082391894
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8734567901234568,
"acc_stderr": 0.018498600558790906,
"acc_norm": 0.8734567901234568,
"acc_norm_stderr": 0.018498600558790906
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6702127659574468,
"acc_stderr": 0.028045946942042405,
"acc_norm": 0.6702127659574468,
"acc_norm_stderr": 0.028045946942042405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6245110821382008,
"acc_stderr": 0.012367945396728202,
"acc_norm": 0.6245110821382008,
"acc_norm_stderr": 0.012367945396728202
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8419117647058824,
"acc_stderr": 0.02216146260806852,
"acc_norm": 0.8419117647058824,
"acc_norm_stderr": 0.02216146260806852
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.01564306991127334,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.01564306991127334
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.02342097206916633,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.02342097206916633
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700637,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700637
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.5302838395915566,
"mc2_stderr": 0.014898239428144871
},
"harness|winogrande|5": {
"acc": 0.824782951854775,
"acc_stderr": 0.01068417922770619
},
"harness|gsm8k|5": {
"acc": 0.5398028809704322,
"acc_stderr": 0.013728776714099365
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Q | [
"region:us"
] | 2023-12-04T22:55:49+00:00 | {"pretty_name": "Evaluation run of kyujinpy/PlatYi-34B-Q", "dataset_summary": "Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-Q](https://huggingface.co/kyujinpy/PlatYi-34B-Q) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Q\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T22:52:59.529862](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Q/blob/main/results_2023-12-04T22-52-59.529862.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7691265720086405,\n \"acc_stderr\": 0.027746564509879067,\n \"acc_norm\": 0.7760516739566624,\n \"acc_norm_stderr\": 0.028247188531903868,\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5302838395915566,\n \"mc2_stderr\": 0.014898239428144871\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042194,\n \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817825\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.654052977494523,\n \"acc_stderr\": 0.004747038768172525,\n \"acc_norm\": 0.8514240191196972,\n \"acc_norm_stderr\": 0.0035494312479073674\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930384,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775402,\n \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775402\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.0339175032232166,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.0339175032232166\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7872340425531915,\n \"acc_stderr\": 0.026754391348039766,\n \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.026754391348039766\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8206896551724138,\n \"acc_stderr\": 0.03196766433373187,\n \"acc_norm\": 0.8206896551724138,\n \"acc_norm_stderr\": 0.03196766433373187\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7354497354497355,\n \"acc_stderr\": 0.022717467897708617,\n \"acc_norm\": 0.7354497354497355,\n \"acc_norm_stderr\": 0.022717467897708617\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 
0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9096774193548387,\n \"acc_stderr\": 0.01630657064448832,\n \"acc_norm\": 0.9096774193548387,\n \"acc_norm_stderr\": 0.01630657064448832\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6748768472906403,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.01349265975129514,\n \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.01349265975129514\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.020280805062535726,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.020280805062535726\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4703703703703704,\n \"acc_stderr\": 0.03043196354793659,\n \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.03043196354793659\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n 
\"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.02186325849485212,\n \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.02186325849485212\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5496688741721855,\n \"acc_stderr\": 0.04062290018683775,\n \"acc_norm\": 0.5496688741721855,\n \"acc_norm_stderr\": 0.04062290018683775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334877,\n \"acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334877\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.03167468706828979,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.03167468706828979\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089674,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089674\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.919831223628692,\n \"acc_stderr\": 0.017676679991891625,\n \"acc_norm\": 0.919831223628692,\n \"acc_norm_stderr\": 0.017676679991891625\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9256198347107438,\n \"acc_stderr\": 0.023952688836676752,\n \"acc_norm\": 0.9256198347107438,\n \"acc_norm_stderr\": 0.023952688836676752\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 
0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8834355828220859,\n \"acc_stderr\": 0.025212327210507108,\n \"acc_norm\": 0.8834355828220859,\n \"acc_norm_stderr\": 0.025212327210507108\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6517857142857143,\n \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.6517857142857143,\n \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9223300970873787,\n \"acc_stderr\": 0.02650144078476276,\n \"acc_norm\": 0.9223300970873787,\n \"acc_norm_stderr\": 0.02650144078476276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n \"acc_stderr\": 0.017456987872436186,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.017456987872436186\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466143,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466143\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9195402298850575,\n \"acc_stderr\": 0.009726831316141849,\n \"acc_norm\": 0.9195402298850575,\n \"acc_norm_stderr\": 0.009726831316141849\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442265,\n \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442265\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7441340782122905,\n \"acc_stderr\": 0.014593620923210739,\n \"acc_norm\": 0.7441340782122905,\n \"acc_norm_stderr\": 0.014593620923210739\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043693,\n \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043693\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8392282958199357,\n \"acc_stderr\": 0.020862388082391894,\n \"acc_norm\": 0.8392282958199357,\n \"acc_norm_stderr\": 
0.020862388082391894\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.018498600558790906,\n \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.018498600558790906\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6702127659574468,\n \"acc_stderr\": 0.028045946942042405,\n \"acc_norm\": 0.6702127659574468,\n \"acc_norm_stderr\": 0.028045946942042405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6245110821382008,\n \"acc_stderr\": 0.012367945396728202,\n \"acc_norm\": 0.6245110821382008,\n \"acc_norm_stderr\": 0.012367945396728202\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8419117647058824,\n \"acc_stderr\": 0.02216146260806852,\n \"acc_norm\": 0.8419117647058824,\n \"acc_norm_stderr\": 0.02216146260806852\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.01564306991127334,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.01564306991127334\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.04122066502878285,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.04122066502878285\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.02342097206916633,\n \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.02342097206916633\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n 
\"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5302838395915566,\n \"mc2_stderr\": 0.014898239428144871\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.01068417922770619\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5398028809704322,\n \"acc_stderr\": 0.013728776714099365\n }\n}\n```", "repo_url": "https://huggingface.co/kyujinpy/PlatYi-34B-Q", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|arc:challenge|25_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|gsm8k|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hellaswag|10_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-52-59.529862.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-52-59.529862.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-52-59.529862.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-52-59.529862.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-52-59.529862.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-52-59.529862.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T22-52-59.529862.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T22-52-59.529862.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["**/details_harness|winogrande|5_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T22-52-59.529862.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T22_52_59.529862", "path": ["results_2023-12-04T22-52-59.529862.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T22-52-59.529862.parquet"]}]}]} | 2023-12-04T22:56:37+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Q
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-Q on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
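A minimal sketch of such a loading call with the `datasets` library (the `open-llm-leaderboard/details_<org>__<model>` repo-id convention is an assumption based on how these detail datasets are usually published, and the helper names are hypothetical):

```python
def details_repo_id(model_id: str) -> str:
    # "org/model" -> "open-llm-leaderboard/details_org__model" (assumed naming convention)
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")


def load_details(model_id: str, config: str, split: str = "latest"):
    # Requires the `datasets` library; downloads from the Hugging Face Hub.
    from datasets import load_dataset

    return load_dataset(details_repo_id(model_id), config, split=split)


# Example (fetches data from the Hub when run):
# data = load_details("kyujinpy/PlatYi-34B-Q", "harness_winogrande_5")
```

Each evaluated task corresponds to one config name (e.g. `harness_winogrande_5`), and the `"latest"` split returns the most recent run for that task.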
## Latest results
These are the latest results from run 2023-12-04T22:52:59.529862 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Q",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-Q on the Open... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Q",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy... | [
6,
21,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Q## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatYi-3... |
48259648c202d82469341ae6070f60b77cbfc6b3 |
## This is a TriviaQA Wikipedia dataset that was reformatted and "answer_start" added
This dataset has a maximum context length of 5000 tokens.
I used this dataset for my research; you can find the code for reformatting TriviaQA here:
https://github.com/Kkordik/NovelQSI
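In SQuAD-style records, `answer_start` is the character offset of the answer within the context. A minimal sketch of that invariant, with field names assumed to follow the SQuAD schema and a toy record that is not taken from the dataset itself:

```python
def check_answer_start(example):
    """Return True if the answer text occurs at its recorded character offset."""
    answer = example["answers"]["text"][0]
    start = example["answers"]["answer_start"][0]
    return example["context"][start : start + len(answer)] == answer

# Toy SQuAD-style record (illustrative only)
record = {
    "context": "Trivia questions are often drawn from Wikipedia articles.",
    "question": "Where are trivia questions often drawn from?",
    "answers": {"text": ["Wikipedia articles"], "answer_start": [38]},
}
print(check_answer_start(record))  # True
```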
| Kkordik/TriviaQA_SQuAD | [
"task_categories:question-answering",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2023-12-04T23:01:50+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"]} | 2023-12-05T06:46:32+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
|
## This is a TriviaQA Wikipedia dataset that was reformatted and "answer_start" added
This dataset has a maximum context length of 5000 tokens.
I used this dataset for my research; you can find the code for reformatting TriviaQA here:
URL
| [
"## This is a TriviaQA wikipedia dataset that was reformated and \"answer_start\" added\n\nThis dataset has context max tokens length of 5000.\n\nI used this dataset for my research, you can find code for reformatting TriviaQA here:\n\nURL"
] | [
"TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n",
"## This is a TriviaQA wikipedia dataset that was reformated and \"answer_start\" added\n\nThis dataset has context max tokens length of 5000.\n\nI used this dataset for my research, you ca... | [
42,
57
] | [
"passage: TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n## This is a TriviaQA wikipedia dataset that was reformated and \"answer_start\" added\n\nThis dataset has context max tokens length of 5000.\n\nI used this dataset for my research, you... |
aee17b9ecfcb4624d93b4b28e1396873196671d1 | # Dataset Card for "cityscape_3_classes_offset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Arham-Imran/cityscape_3_classes_offset | [
"region:us"
] | 2023-12-04T23:08:37+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 6824060744.525, "num_examples": 2975}, {"name": "val", "num_bytes": 1185871140.0, "num_examples": 500}], "download_size": 3207188951, "dataset_size": 8009931884.525}} | 2023-12-05T09:45:50+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cityscape_3_classes_offset"
More Information needed | [
"# Dataset Card for \"cityscape_3_classes_offset\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cityscape_3_classes_offset\"\n\nMore Information needed"
] | [
6,
20
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"cityscape_3_classes_offset\"\n\nMore Information needed"
] |
2ddf3079272f07005d856d3eae4b0aefd36d21fd |
# Dataset Card for Evaluation run of migtissera/Tess-M-v1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Tess-M-v1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Tess-M-v1.3](https://huggingface.co/migtissera/Tess-M-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-M-v1.3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T23:32:51.712332](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-M-v1.3/blob/main/results_2023-12-04T23-32-51.712332.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.747566002523043,
"acc_stderr": 0.028749261755203245,
"acc_norm": 0.7529037953743296,
"acc_norm_stderr": 0.029285728391357593,
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559638,
"mc2": 0.5603469779031626,
"mc2_stderr": 0.015661408014010857
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449705,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893456
},
"harness|hellaswag|10": {
"acc": 0.6494722166899024,
"acc_stderr": 0.004761601303258892,
"acc_norm": 0.8394742083250348,
"acc_norm_stderr": 0.0036634275361781586
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8377358490566038,
"acc_stderr": 0.022691482872035353,
"acc_norm": 0.8377358490566038,
"acc_norm_stderr": 0.022691482872035353
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.026280550932848062,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.026280550932848062
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.03456425745086999,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.03456425745086999
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.026947483121496228,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.026947483121496228
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7448275862068966,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.7448275862068966,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6640211640211641,
"acc_stderr": 0.02432631052914915,
"acc_norm": 0.6640211640211641,
"acc_norm_stderr": 0.02432631052914915
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.017066403719657248,
"acc_norm": 0.9,
"acc_norm_stderr": 0.017066403719657248
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706467,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706467
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.02098480861004794,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.02098480861004794
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.01349265975129514,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.01349265975129514
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8025641025641026,
"acc_stderr": 0.02018264696867483,
"acc_norm": 0.8025641025641026,
"acc_norm_stderr": 0.02018264696867483
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.02977384701253297,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.02977384701253297
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.04075224992216979,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.04075224992216979
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.011918819327334889,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.011918819327334889
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9362745098039216,
"acc_stderr": 0.01714392165552496,
"acc_norm": 0.9362745098039216,
"acc_norm_stderr": 0.01714392165552496
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.01809424711647332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.01809424711647332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804468,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804468
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035216,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035216
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.03145703854306251,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.03145703854306251
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6071428571428571,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.6071428571428571,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.033932957297610096,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.033932957297610096
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813234,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813234
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292847,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292847
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8034682080924855,
"acc_stderr": 0.021393961404363847,
"acc_norm": 0.8034682080924855,
"acc_norm_stderr": 0.021393961404363847
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6871508379888268,
"acc_stderr": 0.015506892594647258,
"acc_norm": 0.6871508379888268,
"acc_norm_stderr": 0.015506892594647258
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.021339479988816024,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.021339479988816024
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.819935691318328,
"acc_stderr": 0.021823422857744943,
"acc_norm": 0.819935691318328,
"acc_norm_stderr": 0.021823422857744943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8765432098765432,
"acc_stderr": 0.01830386880689179,
"acc_norm": 0.8765432098765432,
"acc_norm_stderr": 0.01830386880689179
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6312056737588653,
"acc_stderr": 0.02878222756134726,
"acc_norm": 0.6312056737588653,
"acc_norm_stderr": 0.02878222756134726
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5984354628422425,
"acc_stderr": 0.01252031512014712,
"acc_norm": 0.5984354628422425,
"acc_norm_stderr": 0.01252031512014712
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8272058823529411,
"acc_stderr": 0.022966067585581795,
"acc_norm": 0.8272058823529411,
"acc_norm_stderr": 0.022966067585581795
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.015588643495370463,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.015588643495370463
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.02520696315422539,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.02520696315422539
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.0261682213446623,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.0261682213446623
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559638,
"mc2": 0.5603469779031626,
"mc2_stderr": 0.015661408014010857
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019799
},
"harness|gsm8k|5": {
"acc": 0.5921152388172858,
"acc_stderr": 0.013536742075643088
}
}
```
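As a small worked example of reading the block above, per-task entries can be aggregated by key prefix — a sketch using two of the values shown (this is an illustration, not an official leaderboard aggregate):

```python
# Two entries copied from the results above
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.7111111111111111},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8618421052631579},
}

def mean_acc(results, prefix="harness|hendrycksTest-"):
    """Average the "acc" metric over all task keys sharing a prefix."""
    accs = [v["acc"] for k, v in results.items() if k.startswith(prefix)]
    return sum(accs) / len(accs)

print(round(mean_acc(results), 4))  # 0.7865
```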
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_migtissera__Tess-M-v1.3 | [
"region:us"
] | 2023-12-04T23:35:40+00:00 | {"pretty_name": "Evaluation run of migtissera/Tess-M-v1.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/Tess-M-v1.3](https://huggingface.co/migtissera/Tess-M-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-M-v1.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T23:32:51.712332](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-M-v1.3/blob/main/results_2023-12-04T23-32-51.712332.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.747566002523043,\n \"acc_stderr\": 0.028749261755203245,\n \"acc_norm\": 0.7529037953743296,\n \"acc_norm_stderr\": 0.029285728391357593,\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559638,\n \"mc2\": 0.5603469779031626,\n \"mc2_stderr\": 0.015661408014010857\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449705,\n \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893456\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6494722166899024,\n \"acc_stderr\": 0.004761601303258892,\n \"acc_norm\": 0.8394742083250348,\n \"acc_norm_stderr\": 0.0036634275361781586\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8377358490566038,\n \"acc_stderr\": 0.022691482872035353,\n \"acc_norm\": 0.8377358490566038,\n \"acc_norm_stderr\": 0.022691482872035353\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.026280550932848062,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.026280550932848062\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.03456425745086999,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.03456425745086999\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.026947483121496228,\n \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.026947483121496228\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7448275862068966,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.7448275862068966,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6640211640211641,\n \"acc_stderr\": 0.02432631052914915,\n \"acc_norm\": 0.6640211640211641,\n \"acc_norm_stderr\": 0.02432631052914915\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 
0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.017066403719657248,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.017066403719657248\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706467,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706467\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9040404040404041,\n \"acc_stderr\": 0.02098480861004794,\n \"acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.02098480861004794\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.01349265975129514,\n \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.01349265975129514\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8025641025641026,\n \"acc_stderr\": 0.02018264696867483,\n \"acc_norm\": 0.8025641025641026,\n \"acc_norm_stderr\": 0.02018264696867483\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3925925925925926,\n \"acc_stderr\": 0.02977384701253297,\n \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.02977384701253297\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n 
\"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.04075224992216979,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.04075224992216979\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334889,\n \"acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334889\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9362745098039216,\n \"acc_stderr\": 0.01714392165552496,\n \"acc_norm\": 0.9362745098039216,\n \"acc_norm_stderr\": 0.01714392165552496\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647332,\n \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647332\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804468,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804468\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035216,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035216\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.03145703854306251,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 
0.03145703854306251\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.6071428571428571,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.033932957297610096,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.033932957297610096\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813234,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813234\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9067688378033205,\n \"acc_stderr\": 0.010397417087292847,\n \"acc_norm\": 0.9067688378033205,\n \"acc_norm_stderr\": 0.010397417087292847\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8034682080924855,\n \"acc_stderr\": 0.021393961404363847,\n \"acc_norm\": 0.8034682080924855,\n \"acc_norm_stderr\": 0.021393961404363847\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6871508379888268,\n \"acc_stderr\": 0.015506892594647258,\n \"acc_norm\": 0.6871508379888268,\n \"acc_norm_stderr\": 0.015506892594647258\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.021339479988816024,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.021339479988816024\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.819935691318328,\n \"acc_stderr\": 0.021823422857744943,\n \"acc_norm\": 0.819935691318328,\n \"acc_norm_stderr\": 
0.021823422857744943\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.01830386880689179,\n \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.01830386880689179\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6312056737588653,\n \"acc_stderr\": 0.02878222756134726,\n \"acc_norm\": 0.6312056737588653,\n \"acc_norm_stderr\": 0.02878222756134726\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5984354628422425,\n \"acc_stderr\": 0.01252031512014712,\n \"acc_norm\": 0.5984354628422425,\n \"acc_norm_stderr\": 0.01252031512014712\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8272058823529411,\n \"acc_stderr\": 0.022966067585581795,\n \"acc_norm\": 0.8272058823529411,\n \"acc_norm_stderr\": 0.022966067585581795\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.015588643495370463,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.015588643495370463\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.02520696315422539,\n \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.02520696315422539\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n 
\"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.0261682213446623,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.0261682213446623\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559638,\n \"mc2\": 0.5603469779031626,\n \"mc2_stderr\": 0.015661408014010857\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019799\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5921152388172858,\n \"acc_stderr\": 0.013536742075643088\n }\n}\n```", "repo_url": "https://huggingface.co/migtissera/Tess-M-v1.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|arc:challenge|25_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|gsm8k|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hellaswag|10_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T23-32-51.712332.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T23-32-51.712332.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T23-32-51.712332.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T23-32-51.712332.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T23-32-51.712332.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T23-32-51.712332.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T23-32-51.712332.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T23-32-51.712332.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["**/details_harness|winogrande|5_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T23-32-51.712332.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T23_32_51.712332", "path": ["results_2023-12-04T23-32-51.712332.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T23-32-51.712332.parquet"]}]}]} | 2023-12-04T23:36:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of migtissera/Tess-M-v1.3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model migtissera/Tess-M-v1.3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
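The code block this sentence introduces was stripped in this rendering; a minimal sketch of the usual pattern follows. The `open-llm-leaderboard/details_<org>__<model>` repo naming and the `harness_winogrande_5` config name are assumptions carried over from sibling cards in this collection, and `details_repo_id`/`load_details` are hypothetical helper names:

```python
def details_repo_id(model_id: str) -> str:
    # Assumed naming convention for Open LLM Leaderboard details repos:
    # "open-llm-leaderboard/details_<org>__<model>".
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

def load_details(model_id: str, config: str = "harness_winogrande_5"):
    # Requires the `datasets` package and network access.
    from datasets import load_dataset
    return load_dataset(details_repo_id(model_id), config, split="train")

print(details_repo_id("migtissera/Tess-M-v1.3"))
# → open-llm-leaderboard/details_migtissera__Tess-M-v1.3
```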
## Latest results
These are the latest results from run 2023-12-04T23:32:51.712332 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
6f512972fe3a5890afe987cd9cb3f5b51d77872d |
# Instruction-Following Evaluation Dataset
## 📜 Overview
This dataset, specifically designed for the **evaluation of large language models in instruction-following tasks**, is directly inspired by the methodologies and experiments described in the paper titled _"Instruction-Following Evaluation for Large Language Models"_. The dataset's creation and availability on HuggingFace are aimed at enhancing research and application in the field of natural language understanding, particularly in the context of instruction interpretation and execution by AI models.
## 🌐 Source
The dataset draws its structure and content from the insights provided in:
- **Original Research Paper**: [_"Instruction-Following Evaluation for Large Language Models"_](https://arxiv.org/abs/2311.07911)
- **Original Data Repository**: [Google Research on GitHub](https://github.com/google-research/google-research/tree/master/instruction_following_eval)
## 📊 Dataset Structure
Consisting primarily of **'prompts'**, this dataset is tailored to challenge and assess language models on various facets of understanding and executing instructions. Each prompt represents a unique scenario or task, simulating real-world applications where accurate interpretation of instructions is crucial.
## 💡 Usage
Targeted for use within the **HuggingFace ecosystem**, this dataset serves as a pivotal tool for researchers and developers focusing on the advancement of language models. It stands as a benchmark for:
- 📈 Evaluating model performance in instruction-following tasks.
- 🔍 Identifying model capabilities and areas of improvement.
- 🤖 Enhancing AI's understanding of complex, human-like commands.
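As a sketch of how a single example might be consumed: the row below is hypothetical (field names follow this dataset's schema, where `kwargs` is stored as a string; the JSON-list serialization with one constraint dict per instruction id is an assumption):

```python
import json

# Hypothetical example row shaped like this dataset's schema.
row = {
    "key": 1000,
    "prompt": "Write a 300+ word summary of the topic; do not use any commas.",
    "instruction_id_list": [
        "punctuation:no_comma",
        "length_constraints:number_words",
    ],
    # Assumed serialization: a JSON list with one kwargs dict per instruction.
    "kwargs": '[{}, {"relation": "at least", "num_words": 300}]',
}

constraints = json.loads(row["kwargs"])
for instruction_id, kw in zip(row["instruction_id_list"], constraints):
    print(instruction_id, kw or "(no parameters)")
```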
## 🙏 Acknowledgements
This dataset is a tribute to the foundational work presented in the original paper and is intended for academic and research purposes. It reflects a commitment to furthering the understanding of AI's interaction with human language, particularly in processing and responding to diverse and complex instructions.
| harpreetsahota/Instruction-Following-Evaluation-for-Large-Language-Models | [
"arxiv:2311.07911",
"region:us"
] | 2023-12-04T23:42:12+00:00 | {"dataset_info": {"features": [{"name": "key", "dtype": "int64"}, {"name": "prompt", "dtype": "string"}, {"name": "instruction_id_list", "sequence": "string"}, {"name": "kwargs", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 181824, "num_examples": 541}], "download_size": 80840, "dataset_size": 181824}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-16T01:06:32+00:00 | [
"2311.07911"
] | [] | TAGS
#arxiv-2311.07911 #region-us
2e3aff5b8d58eade73a6b3760088399ebecfc034 |
# Dataset Card for Evaluation run of APMIC/caigun-lora-model-33B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/APMIC/caigun-lora-model-33B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [APMIC/caigun-lora-model-33B](https://huggingface.co/APMIC/caigun-lora-model-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_APMIC__caigun-lora-model-33B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-05T00:06:51.823733](https://huggingface.co/datasets/open-llm-leaderboard/details_APMIC__caigun-lora-model-33B/blob/main/results_2023-12-05T00-06-51.823733.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
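A quick way to work with this file is to recompute summary statistics from the per-task entries; the sketch below averages `acc` over just three entries reproduced from the JSON above (the real file has one entry per task):

```python
# Reproduces a few per-task entries from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc": 0.22696245733788395},
    "harness|hellaswag|10": {"acc": 0.2504481179047998},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
}

# Unweighted mean accuracy across the tasks included above.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(f"mean acc over {len(results)} tasks: {mean_acc:.4f}")
```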
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_APMIC__caigun-lora-model-33B | [
"region:us"
] | 2023-12-05T00:09:10+00:00 | {"pretty_name": "Evaluation run of APMIC/caigun-lora-model-33B", "dataset_summary": "Dataset automatically created during the evaluation run of model [APMIC/caigun-lora-model-33B](https://huggingface.co/APMIC/caigun-lora-model-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_APMIC__caigun-lora-model-33B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-05T00:06:51.823733](https://huggingface.co/datasets/open-llm-leaderboard/details_APMIC__caigun-lora-model-33B/blob/main/results_2023-12-05T00-06-51.823733.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n 
\"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n 
\"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 
0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n 
\"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n 
\"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n 
},\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/APMIC/caigun-lora-model-33B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|arc:challenge|25_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|gsm8k|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hellaswag|10_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T00-06-51.823733.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T00-06-51.823733.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T00-06-51.823733.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T00-06-51.823733.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T00-06-51.823733.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T00-06-51.823733.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T00-06-51.823733.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T00-06-51.823733.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": 
["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T00-06-51.823733.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T00-06-51.823733.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["**/details_harness|winogrande|5_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-05T00-06-51.823733.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_05T00_06_51.823733", "path": ["results_2023-12-05T00-06-51.823733.parquet"]}, {"split": "latest", "path": ["results_2023-12-05T00-06-51.823733.parquet"]}]}]} | 2023-12-05T00:09:53+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of APMIC/caigun-lora-model-33B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model APMIC/caigun-lora-model-33B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
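For instance (a minimal sketch; the `details_<org>__<model>` repo id below is inferred from the leaderboard's usual naming convention, not stated explicitly in this card):

```python
# Minimal sketch of addressing one evaluation config of this repository.
# The repo id follows the leaderboard's details_<org>__<model> naming
# convention (an inference from sibling cards, not an official API).
org, model = "APMIC", "caigun-lora-model-33B"
repo_id = f"open-llm-leaderboard/details_{org}__{model}"
config = "harness_winogrande_5"

print(repo_id)  # open-llm-leaderboard/details_APMIC__caigun-lora-model-33B

# With the `datasets` library installed, the details for this config load as:
#   from datasets import load_dataset
#   data = load_dataset(repo_id, config, split="train")
```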
## Latest results
These are the latest results from run 2023-12-05T00:06:51.823733 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of APMIC/caigun-lora-model-33B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model APMIC/caigun-lora-model-33B... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of APMIC/caigun-lora-model-33B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AP... | [
6,
22,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of APMIC/caigun-lora-model-33B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model APMIC/caigu... |
46ccda446ca0f4cf303653d9022928487e047657 | # Dataset Card for "INFO-desc-llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | fightfei/INFO-desc-llama2 | [
"region:us"
] | 2023-12-05T00:09:48+00:00 | {"dataset_info": {"features": [{"name": "Subject Code", "dtype": "string"}, {"name": "Subject number", "dtype": "int64"}, {"name": "Unnamed: 2", "dtype": "string"}, {"name": "Hours", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2394.0, "num_examples": 36}, {"name": "test", "num_bytes": 266.0, "num_examples": 4}], "download_size": 6214, "dataset_size": 2660.0}} | 2023-12-05T00:11:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "INFO-desc-llama2"
More Information needed | [
"# Dataset Card for \"INFO-desc-llama2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"INFO-desc-llama2\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"INFO-desc-llama2\"\n\nMore Information needed"
] |
44282dea81be1fefa3bfd791e5ee686060b6aa28 | # Dataset Card for "fintuned-llm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mhzarem76/fintuned-llm | [
"region:us"
] | 2023-12-05T00:46:14+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18997844, "num_examples": 51942}], "download_size": 11986973, "dataset_size": 18997844}} | 2023-12-05T00:46:16+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "fintuned-llm"
More Information needed | [
"# Dataset Card for \"fintuned-llm\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"fintuned-llm\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"fintuned-llm\"\n\nMore Information needed"
] |
d87e35b04df2db9ee020930f90cab1d18822d2b9 |
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-LoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kyujinpy/PlatYi-34B-LoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-LoRA](https://huggingface.co/kyujinpy/PlatYi-34B-LoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-LoRA",
"harness_winogrande_5",
split="train")
```
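
Each per-task config name in this repository follows a fixed pattern derived from the harness task string; a small helper sketching that mapping (the rule is inferred from the config list in this card, it is not an official API):

```python
# Sketch of the config-name convention used by this repository:
# "harness" + the task string with ":" and "-" replaced by "_",
# followed by the few-shot count.
def harness_config_name(task: str, num_fewshot: int) -> str:
    base = task.replace(":", "_").replace("-", "_")
    return f"harness_{base}_{num_fewshot}"

print(harness_config_name("winogrande", 5))              # harness_winogrande_5
print(harness_config_name("truthfulqa:mc", 0))           # harness_truthfulqa_mc_0
print(harness_config_name("hendrycksTest-virology", 5))  # harness_hendrycksTest_virology_5
```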
## Latest results
These are the [latest results from run 2023-12-05T00:53:53.251371](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-LoRA/blob/main/results_2023-12-05T00-53-53.251371.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.774827191806162,
"acc_stderr": 0.027570240573755966,
"acc_norm": 0.7838406442171528,
"acc_norm_stderr": 0.028074245588194685,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5332425206331443,
"mc2_stderr": 0.014839740435159312
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620189,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537297
},
"harness|hellaswag|10": {
"acc": 0.6567416849233221,
"acc_stderr": 0.00473826494473715,
"acc_norm": 0.8537143995220076,
"acc_norm_stderr": 0.003526700741879443
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7555555555555555,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.7555555555555555,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.024974533450920697,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.024974533450920697
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.02461829819586651,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02461829819586651
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8137931034482758,
"acc_stderr": 0.03243946159004616,
"acc_norm": 0.8137931034482758,
"acc_norm_stderr": 0.03243946159004616
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7380952380952381,
"acc_stderr": 0.02264421261552521,
"acc_norm": 0.7380952380952381,
"acc_norm_stderr": 0.02264421261552521
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9161290322580645,
"acc_stderr": 0.015769027496775667,
"acc_norm": 0.9161290322580645,
"acc_norm_stderr": 0.015769027496775667
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6699507389162561,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.6699507389162561,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706473,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706473
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993093,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993093
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8256410256410256,
"acc_stderr": 0.01923724980340523,
"acc_norm": 0.8256410256410256,
"acc_norm_stderr": 0.01923724980340523
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.03044452852881074,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.03044452852881074
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8781512605042017,
"acc_stderr": 0.021248144538412016,
"acc_norm": 0.8781512605042017,
"acc_norm_stderr": 0.021248144538412016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5562913907284768,
"acc_stderr": 0.04056527902281733,
"acc_norm": 0.5562913907284768,
"acc_norm_stderr": 0.04056527902281733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9284403669724771,
"acc_stderr": 0.011051255247815462,
"acc_norm": 0.9284403669724771,
"acc_norm_stderr": 0.011051255247815462
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7175925925925926,
"acc_stderr": 0.030701372111510927,
"acc_norm": 0.7175925925925926,
"acc_norm_stderr": 0.030701372111510927
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9282700421940928,
"acc_stderr": 0.01679698961111959,
"acc_norm": 0.9282700421940928,
"acc_norm_stderr": 0.01679698961111959
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407256,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407256
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.02871877688934232,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.02871877688934232
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540637,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540637
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.029239272675632748,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.029239272675632748
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.02684576505455385,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.02684576505455385
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6696428571428571,
"acc_stderr": 0.044642857142857116,
"acc_norm": 0.6696428571428571,
"acc_norm_stderr": 0.044642857142857116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331356,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311357,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311357
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9080459770114943,
"acc_stderr": 0.010333225570778513,
"acc_norm": 0.9080459770114943,
"acc_norm_stderr": 0.010333225570778513
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.019829299214925416,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.019829299214925416
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7687150837988826,
"acc_stderr": 0.014102223623152586,
"acc_norm": 0.7687150837988826,
"acc_norm_stderr": 0.014102223623152586
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.01909486481386516,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.01909486481386516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8392282958199357,
"acc_stderr": 0.02086238808239191,
"acc_norm": 0.8392282958199357,
"acc_norm_stderr": 0.02086238808239191
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.018105414094329676,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.018105414094329676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.027807990141320186,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.027807990141320186
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6232073011734028,
"acc_stderr": 0.012376459593894405,
"acc_norm": 0.6232073011734028,
"acc_norm_stderr": 0.012376459593894405
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.022368672562886747,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.022368672562886747
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8464052287581699,
"acc_stderr": 0.014586690876223224,
"acc_norm": 0.8464052287581699,
"acc_norm_stderr": 0.014586690876223224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.02366169917709861,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.02366169917709861
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.02207632610182466,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.02207632610182466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5332425206331443,
"mc2_stderr": 0.014839740435159312
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273763
},
"harness|gsm8k|5": {
"acc": 0.40636846095526913,
"acc_stderr": 0.013528846685413242
}
}
```
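
A small sketch of pulling one metric out of a results payload shaped like the JSON above (the nested keys and values below are taken verbatim from it):

```python
# The "results" config stores per-task dicts keyed by "harness|<task>|<n_shot>",
# each holding the metric and its standard error.
results = {
    "harness|winogrande|5": {"acc": 0.8366219415943172,
                             "acc_stderr": 0.010390695970273763},
    "harness|gsm8k|5": {"acc": 0.40636846095526913,
                        "acc_stderr": 0.013528846685413242},
}

entry = results["harness|winogrande|5"]
print(f"winogrande acc: {entry['acc']:.4f} +/- {entry['acc_stderr']:.4f}")
```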
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_kyujinpy__PlatYi-34B-LoRA | [
"region:us"
] | 2023-12-05T00:56:39+00:00 | {"pretty_name": "Evaluation run of kyujinpy/PlatYi-34B-LoRA", "dataset_summary": "Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-LoRA](https://huggingface.co/kyujinpy/PlatYi-34B-LoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__PlatYi-34B-LoRA\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-05T00:53:53.251371](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-LoRA/blob/main/results_2023-12-05T00-53-53.251371.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.774827191806162,\n \"acc_stderr\": 0.027570240573755966,\n \"acc_norm\": 0.7838406442171528,\n \"acc_norm_stderr\": 0.028074245588194685,\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5332425206331443,\n \"mc2_stderr\": 0.014839740435159312\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620189,\n \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537297\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6567416849233221,\n \"acc_stderr\": 0.00473826494473715,\n \"acc_norm\": 0.8537143995220076,\n \"acc_norm_stderr\": 0.003526700741879443\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7555555555555555,\n \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.7555555555555555,\n \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.024974533450920697,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.024974533450920697\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02461829819586651,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02461829819586651\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349424,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8137931034482758,\n \"acc_stderr\": 0.03243946159004616,\n \"acc_norm\": 0.8137931034482758,\n \"acc_norm_stderr\": 0.03243946159004616\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7380952380952381,\n \"acc_stderr\": 0.02264421261552521,\n \"acc_norm\": 0.7380952380952381,\n \"acc_norm_stderr\": 0.02264421261552521\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n \"acc_stderr\": 0.04403438954768176,\n 
\"acc_norm\": 0.5873015873015873,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9161290322580645,\n \"acc_stderr\": 0.015769027496775667,\n \"acc_norm\": 0.9161290322580645,\n \"acc_norm_stderr\": 0.015769027496775667\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6699507389162561,\n \"acc_stderr\": 0.033085304262282574,\n \"acc_norm\": 0.6699507389162561,\n \"acc_norm_stderr\": 0.033085304262282574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706473,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706473\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993093,\n \"acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993093\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8256410256410256,\n \"acc_stderr\": 0.01923724980340523,\n \"acc_norm\": 0.8256410256410256,\n \"acc_norm_stderr\": 0.01923724980340523\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.03044452852881074,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.03044452852881074\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": 
{\n \"acc\": 0.8781512605042017,\n \"acc_stderr\": 0.021248144538412016,\n \"acc_norm\": 0.8781512605042017,\n \"acc_norm_stderr\": 0.021248144538412016\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5562913907284768,\n \"acc_stderr\": 0.04056527902281733,\n \"acc_norm\": 0.5562913907284768,\n \"acc_norm_stderr\": 0.04056527902281733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9284403669724771,\n \"acc_stderr\": 0.011051255247815462,\n \"acc_norm\": 0.9284403669724771,\n \"acc_norm_stderr\": 0.011051255247815462\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7175925925925926,\n \"acc_stderr\": 0.030701372111510927,\n \"acc_norm\": 0.7175925925925926,\n \"acc_norm_stderr\": 0.030701372111510927\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9282700421940928,\n \"acc_stderr\": 0.01679698961111959,\n \"acc_norm\": 0.9282700421940928,\n \"acc_norm_stderr\": 0.01679698961111959\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540637,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540637\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.029239272675632748,\n \"acc_norm\": 0.8981481481481481,\n 
\"acc_norm_stderr\": 0.029239272675632748\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.02684576505455385,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.02684576505455385\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6696428571428571,\n \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.6696428571428571,\n \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331356,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331356\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.016534627684311357,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.016534627684311357\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9080459770114943,\n \"acc_stderr\": 0.010333225570778513,\n \"acc_norm\": 0.9080459770114943,\n \"acc_norm_stderr\": 0.010333225570778513\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.838150289017341,\n \"acc_stderr\": 0.019829299214925416,\n \"acc_norm\": 0.838150289017341,\n \"acc_norm_stderr\": 0.019829299214925416\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7687150837988826,\n \"acc_stderr\": 0.014102223623152586,\n \"acc_norm\": 0.7687150837988826,\n \"acc_norm_stderr\": 0.014102223623152586\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8725490196078431,\n \"acc_stderr\": 0.01909486481386516,\n \"acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.01909486481386516\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8392282958199357,\n \"acc_stderr\": 0.02086238808239191,\n \"acc_norm\": 0.8392282958199357,\n 
\"acc_norm_stderr\": 0.02086238808239191\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.018105414094329676,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.018105414094329676\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.027807990141320186,\n \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.027807990141320186\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6232073011734028,\n \"acc_stderr\": 0.012376459593894405,\n \"acc_norm\": 0.6232073011734028,\n \"acc_norm_stderr\": 0.012376459593894405\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.022368672562886747,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.022368672562886747\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.014586690876223224,\n \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.014586690876223224\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.02366169917709861,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.02366169917709861\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.02207632610182466,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.02207632610182466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 
0.572289156626506,\n \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5332425206331443,\n \"mc2_stderr\": 0.014839740435159312\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273763\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40636846095526913,\n \"acc_stderr\": 0.013528846685413242\n }\n}\n```", "repo_url": "https://huggingface.co/kyujinpy/PlatYi-34B-LoRA", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|arc:challenge|25_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|gsm8k|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hellaswag|10_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T00-53-53.251371.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T00-53-53.251371.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T00-53-53.251371.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T00-53-53.251371.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T00-53-53.251371.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T00-53-53.251371.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T00-53-53.251371.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T00-53-53.251371.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["**/details_harness|winogrande|5_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-05T00-53-53.251371.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_05T00_53_53.251371", "path": ["results_2023-12-05T00-53-53.251371.parquet"]}, {"split": "latest", "path": ["results_2023-12-05T00-53-53.251371.parquet"]}]}]} | 2023-12-05T00:57:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-LoRA
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-LoRA on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-05T00:53:53.251371 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-LoRA",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatYi-34B-LoRA on th... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-LoRA",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyuji... | [
6,
22,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-LoRA## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model kyujinpy/PlatY... |
0cc7adc5d5eb8e2651a2d69767cbd59ac57c4b56 | # Dataset Card for "dafny-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | metareflection/dafny-train | [
"region:us"
] | 2023-12-05T00:59:36+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "content", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 122357884, "num_examples": 5619}], "download_size": 16684709, "dataset_size": 122357884}} | 2023-12-05T00:59:43+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dafny-train"
More Information needed | [
"# Dataset Card for \"dafny-train\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dafny-train\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dafny-train\"\n\nMore Information needed"
] |
b770218214a63a7ffb28bf47e400df57650df2b0 | # Dataset Card for "fintuned-llm2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mhzarem76/fintuned-llm2 | [
"region:us"
] | 2023-12-05T01:02:54+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18997844, "num_examples": 51942}], "download_size": 11986973, "dataset_size": 18997844}} | 2023-12-05T01:02:56+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "fintuned-llm2"
More Information needed | [
"# Dataset Card for \"fintuned-llm2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"fintuned-llm2\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"fintuned-llm2\"\n\nMore Information needed"
] |
9bf70f95d828c70a2ee1856352c7edbb1ee19b64 | # Dataset Card for "kor_ai2_arc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
@article{allenai:arc,
author = {Peter Clark and Isaac Cowhey and Oren Etzioni and Tushar Khot and
Ashish Sabharwal and Carissa Schoenick and Oyvind Tafjord},
title = {Think you have Solved Question Answering? Try ARC, the AI2 Reasoning Challenge},
journal = {arXiv:1803.05457v1},
year = {2018},
}
``` | KETI-AIR/kor_ai2_arc | [
"license:cc-by-sa-4.0",
"region:us"
] | 2023-12-05T01:35:37+00:00 | {"license": "cc-by-sa-4.0", "configs": [{"config_name": "ARC-Challenge", "data_files": [{"split": "train", "path": "ARC-Challenge/train-*"}, {"split": "validation", "path": "ARC-Challenge/validation-*"}, {"split": "test", "path": "ARC-Challenge/test-*"}]}, {"config_name": "ARC-Easy", "data_files": [{"split": "train", "path": "ARC-Easy/train-*"}, {"split": "validation", "path": "ARC-Easy/validation-*"}, {"split": "test", "path": "ARC-Easy/test-*"}]}, {"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": [{"config_name": "ARC-Challenge", "features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "struct": [{"name": "text", "sequence": "string"}, {"name": "label", "sequence": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 396164, "num_examples": 1119}, {"name": "validation", "num_bytes": 108314, "num_examples": 299}, {"name": "test", "num_bytes": 425252, "num_examples": 1172}], "download_size": 516331, "dataset_size": 929730}, {"config_name": "ARC-Easy", "features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "choices", "struct": [{"name": "text", "sequence": "string"}, {"name": "label", "sequence": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 694289, "num_examples": 2251}, {"name": "validation", "num_bytes": 175983, "num_examples": 570}, {"name": "test", "num_bytes": 735067, "num_examples": 2376}], "download_size": 861121, "dataset_size": 1605339}, {"config_name": "default", "features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "id", "dtype": "string"}, {"name": "question", "dtype": 
"string"}, {"name": "choices", "struct": [{"name": "text", "sequence": "string"}, {"name": "label", "sequence": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 694289, "num_examples": 2251}, {"name": "validation", "num_bytes": 175983, "num_examples": 570}, {"name": "test", "num_bytes": 735067, "num_examples": 2376}], "download_size": 861121, "dataset_size": 1605339}]} | 2023-12-05T02:37:22+00:00 | [] | [] | TAGS
#license-cc-by-sa-4.0 #region-us
| # Dataset Card for "kor_ai2_arc"
More Information needed
# Source Data Citation Information
| [
"# Dataset Card for \"kor_ai2_arc\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
"TAGS\n#license-cc-by-sa-4.0 #region-us \n",
"# Dataset Card for \"kor_ai2_arc\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
17,
17,
6
] | [
"passage: TAGS\n#license-cc-by-sa-4.0 #region-us \n# Dataset Card for \"kor_ai2_arc\"\n\nMore Information needed# Source Data Citation Information"
] |
e94e035c3c3efe7564f3386bc77a9d2d36f53644 |
# Dataset Card for Evaluation run of ajibawa-2023/Python-Code-33B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ajibawa-2023/Python-Code-33B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ajibawa-2023/Python-Code-33B](https://huggingface.co/ajibawa-2023/Python-Code-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ajibawa-2023__Python-Code-33B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-05T01:45:33.454054](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Python-Code-33B/blob/main/results_2023-12-05T01-45-33.454054.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5411850719173311,
"acc_stderr": 0.03390619726316288,
"acc_norm": 0.5470787854450224,
"acc_norm_stderr": 0.034649290725190234,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.01584631510139481,
"mc2": 0.443943398739872,
"mc2_stderr": 0.01568143022823914
},
"harness|arc:challenge|25": {
"acc": 0.5418088737201365,
"acc_stderr": 0.014560220308714698,
"acc_norm": 0.5631399317406144,
"acc_norm_stderr": 0.014494421584256527
},
"harness|hellaswag|10": {
"acc": 0.622087233618801,
"acc_stderr": 0.004838747305783349,
"acc_norm": 0.8100975901214897,
"acc_norm_stderr": 0.003914221738689083
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5283018867924528,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.5283018867924528,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.038073017265045105,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.038073017265045105
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.02375292871211212,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.02375292871211212
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845693,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845693
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512566,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512566
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683522,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683522
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.019028486711115435,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.019028486711115435
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.03149328104507956,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.03149328104507956
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.02969633871342288,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.02969633871342288
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922754,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7330779054916986,
"acc_stderr": 0.015818450894777552,
"acc_norm": 0.7330779054916986,
"acc_norm_stderr": 0.015818450894777552
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.026329813341946243,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.026329813341946243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3642458100558659,
"acc_stderr": 0.016094338768474596,
"acc_norm": 0.3642458100558659,
"acc_norm_stderr": 0.016094338768474596
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5326797385620915,
"acc_stderr": 0.02856869975222588,
"acc_norm": 0.5326797385620915,
"acc_norm_stderr": 0.02856869975222588
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581996,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581996
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.02720111766692565,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.02720111766692565
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940992,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940992
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935729,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935729
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.020217030653186457,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.020217030653186457
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.031464657128274245,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.031464657128274245
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.03056767593891672,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.03056767593891672
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.01584631510139481,
"mc2": 0.443943398739872,
"mc2_stderr": 0.01568143022823914
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.012134386019865346
},
"harness|gsm8k|5": {
"acc": 0.19181197877179681,
"acc_stderr": 0.010845169955294024
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_ajibawa-2023__Python-Code-33B | [
"region:us"
] | 2023-12-05T01:47:51+00:00 | {"pretty_name": "Evaluation run of ajibawa-2023/Python-Code-33B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ajibawa-2023/Python-Code-33B](https://huggingface.co/ajibawa-2023/Python-Code-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ajibawa-2023__Python-Code-33B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-05T01:45:33.454054](https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Python-Code-33B/blob/main/results_2023-12-05T01-45-33.454054.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5411850719173311,\n \"acc_stderr\": 0.03390619726316288,\n \"acc_norm\": 0.5470787854450224,\n \"acc_norm_stderr\": 0.034649290725190234,\n \"mc1\": 0.2876376988984088,\n \"mc1_stderr\": 0.01584631510139481,\n \"mc2\": 0.443943398739872,\n \"mc2_stderr\": 0.01568143022823914\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.014560220308714698,\n \"acc_norm\": 0.5631399317406144,\n \"acc_norm_stderr\": 0.014494421584256527\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.622087233618801,\n \"acc_stderr\": 0.004838747305783349,\n \"acc_norm\": 0.8100975901214897,\n \"acc_norm_stderr\": 0.003914221738689083\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874143,\n \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874143\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5283018867924528,\n \"acc_stderr\": 0.0307235352490061,\n \"acc_norm\": 0.5283018867924528,\n \"acc_norm_stderr\": 0.0307235352490061\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n 
\"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n \"acc_stderr\": 0.038073017265045105,\n \"acc_norm\": 0.47398843930635837,\n \"acc_norm_stderr\": 0.038073017265045105\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30687830687830686,\n \"acc_stderr\": 0.02375292871211212,\n \"acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.02375292871211212\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 
0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n \"acc_stderr\": 0.027528904299845693,\n \"acc_norm\": 0.6258064516129033,\n \"acc_norm_stderr\": 0.027528904299845693\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512566,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512566\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.025294608023986472,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.025294608023986472\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683522,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683522\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 
0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7302752293577982,\n \"acc_stderr\": 0.019028486711115435,\n \"acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.019028486711115435\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.03149328104507956,\n \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.03149328104507956\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7046413502109705,\n \"acc_stderr\": 0.02969633871342288,\n \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.02969633871342288\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 
0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.03731133519673893,\n \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.03731133519673893\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n \"acc_stderr\": 0.026655699653922754,\n \"acc_norm\": 0.7905982905982906,\n \"acc_norm_stderr\": 0.026655699653922754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7330779054916986,\n \"acc_stderr\": 0.015818450894777552,\n \"acc_norm\": 0.7330779054916986,\n \"acc_norm_stderr\": 0.015818450894777552\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.026329813341946243,\n \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.026329813341946243\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3642458100558659,\n \"acc_stderr\": 0.016094338768474596,\n \"acc_norm\": 0.3642458100558659,\n \"acc_norm_stderr\": 0.016094338768474596\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5326797385620915,\n \"acc_stderr\": 0.02856869975222588,\n \"acc_norm\": 0.5326797385620915,\n \"acc_norm_stderr\": 0.02856869975222588\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n \"acc_stderr\": 0.027604689028581996,\n \"acc_norm\": 0.617363344051447,\n \"acc_norm_stderr\": 
0.027604689028581996\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.02720111766692565,\n \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.02720111766692565\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940992,\n \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940992\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n \"acc_stderr\": 0.012669813464935729,\n \"acc_norm\": 0.43741851368970014,\n \"acc_norm_stderr\": 0.012669813464935729\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.020217030653186457,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.020217030653186457\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.031464657128274245,\n \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.031464657128274245\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.03056767593891672,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.03056767593891672\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 
0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n \"mc1_stderr\": 0.01584631510139481,\n \"mc2\": 0.443943398739872,\n \"mc2_stderr\": 0.01568143022823914\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865346\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19181197877179681,\n \"acc_stderr\": 0.010845169955294024\n }\n}\n```", "repo_url": "https://huggingface.co/ajibawa-2023/Python-Code-33B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|arc:challenge|25_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|gsm8k|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hellaswag|10_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T01-45-33.454054.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T01-45-33.454054.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T01-45-33.454054.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T01-45-33.454054.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T01-45-33.454054.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T01-45-33.454054.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T01-45-33.454054.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T01-45-33.454054.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["**/details_harness|winogrande|5_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-05T01-45-33.454054.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_05T01_45_33.454054", "path": ["results_2023-12-05T01-45-33.454054.parquet"]}, {"split": "latest", "path": ["results_2023-12-05T01-45-33.454054.parquet"]}]}]} | 2023-12-05T01:48:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ajibawa-2023/Python-Code-33B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model ajibawa-2023/Python-Code-33B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
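The code snippet that normally accompanies this sentence is missing from this record; by analogy with the loader shown for the other evaluation runs in this dump, it would look roughly like the sketch below. The repository id is inferred from the Leaderboard's `details_<org>__<model>` naming convention and should be treated as an assumption, not a verified path.

```python
# from datasets import load_dataset  # requires the `datasets` package

def details_repo_id(org: str, model: str) -> str:
    """Build the Hub repository id following the details_<org>__<model> convention."""
    return f"open-llm-leaderboard/details_{org}__{model}"

# Hypothetical usage (downloads data from the Hugging Face Hub):
# data = load_dataset(details_repo_id("ajibawa-2023", "Python-Code-33B"),
#                     "harness_winogrande_5", split="train")
print(details_repo_id("ajibawa-2023", "Python-Code-33B"))
```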
## Latest results
These are the latest results from run 2023-12-05T01:45:33.454054 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of ajibawa-2023/Python-Code-33B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ajibawa-2023/Python-Code-3... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ajibawa-2023/Python-Code-33B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model a... | [
6,
22,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ajibawa-2023/Python-Code-33B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ajibawa-20... |
130efdbaf6c6f8a8e519ee210405ee764692ddb8 |
The dataset comprises aerial imagery of Dubai acquired by MBRSC satellites and annotated with pixel-level semantic segmentation across 6 distinct classes. It contains a total of 72 images, organised into 6 larger tiles. The categories are as follows:
Credit: Humans in the Loop is releasing an openly accessible dataset that has been annotated for a collaborative project with the Mohammed Bin Rashid Space Centre in Dubai, United Arab Emirates.
| gymprathap/Semantic-Segmentation-Aerial-Imagery-Dataset | [
"task_categories:feature-extraction",
"size_categories:n<1K",
"language:en",
"license:cc-by-4.0",
"climate",
"region:us"
] | 2023-12-05T02:11:37+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["n<1K"], "task_categories": ["feature-extraction"], "pretty_name": "Semantic segmentation of aerial imagery", "tags": ["climate"]} | 2023-12-05T02:19:34+00:00 | [] | [
"en"
] | TAGS
#task_categories-feature-extraction #size_categories-n<1K #language-English #license-cc-by-4.0 #climate #region-us
|
The dataset comprises aerial imagery of Dubai acquired by MBRSC satellites and annotated with pixel-level semantic segmentation across 6 distinct classes. It contains a total of 72 images, organised into 6 larger tiles. The categories are as follows:
Credit: Humans in the Loop is releasing an openly accessible dataset that has been annotated for a collaborative project with the Mohammed Bin Rashid Space Centre in Dubai, United Arab Emirates.
| [] | [
"TAGS\n#task_categories-feature-extraction #size_categories-n<1K #language-English #license-cc-by-4.0 #climate #region-us \n"
] | [
45
] | [
"passage: TAGS\n#task_categories-feature-extraction #size_categories-n<1K #language-English #license-cc-by-4.0 #climate #region-us \n"
] |
8bc89c0e523b645e6933cb0b107d9276c6ca2cb2 | # Dataset Card for "kor_commonsense_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
@inproceedings{talmor-etal-2019-commonsenseqa,
title = "{C}ommonsense{QA}: A Question Answering Challenge Targeting Commonsense Knowledge",
author = "Talmor, Alon and
Herzig, Jonathan and
Lourie, Nicholas and
Berant, Jonathan",
booktitle = "Proceedings of the 2019 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)",
month = jun,
year = "2019",
address = "Minneapolis, Minnesota",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N19-1421",
doi = "10.18653/v1/N19-1421",
pages = "4149--4158",
archivePrefix = "arXiv",
eprint = "1811.00937",
primaryClass = "cs",
}
``` | KETI-AIR/kor_commonsense_qa | [
"license:mit",
"region:us"
] | 2023-12-05T02:17:36+00:00 | {"license": "mit", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "question_concept", "dtype": "string"}, {"name": "choices", "struct": [{"name": "text", "sequence": "string"}, {"name": "label", "sequence": "string"}]}, {"name": "answerKey", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2642161, "num_examples": 9741}, {"name": "validation", "num_bytes": 327694, "num_examples": 1221}, {"name": "test", "num_bytes": 309213, "num_examples": 1140}], "download_size": 1782280, "dataset_size": 3279068}} | 2023-12-05T02:36:54+00:00 | [] | [] | TAGS
#license-mit #region-us
| # Dataset Card for "kor_commonsense_qa"
More Information needed
# Source Data Citation Information
| [
"# Dataset Card for \"kor_commonsense_qa\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
"TAGS\n#license-mit #region-us \n",
"# Dataset Card for \"kor_commonsense_qa\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
11,
17,
6
] | [
"passage: TAGS\n#license-mit #region-us \n# Dataset Card for \"kor_commonsense_qa\"\n\nMore Information needed# Source Data Citation Information"
] |
a21f12834657cc0a2badd311003dab2597c0b124 | # Dataset Card for "librispeech960-wavlm-large-km1000_asr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | cmu-mlsp/librispeech960-wavlm-large-km1000_asr | [
"region:us"
] | 2023-12-05T02:19:25+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation_other", "path": "data/validation_other-*"}, {"split": "test_other", "path": "data/test_other-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "audio_codes", "sequence": "string"}, {"name": "id", "dtype": "string"}, {"name": "speaker_id", "dtype": "int64"}, {"name": "chapter_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1246247156, "num_examples": 281241}, {"name": "validation", "num_bytes": 7052458, "num_examples": 2703}, {"name": "test", "num_bytes": 7062964, "num_examples": 2620}, {"name": "validation_other", "num_bytes": 6706447, "num_examples": 2864}, {"name": "test_other", "num_bytes": 6987808, "num_examples": 2939}], "download_size": 254541270, "dataset_size": 1274056833}} | 2023-12-05T02:20:54+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "librispeech960-wavlm-large-km1000_asr"
More Information needed | [
"# Dataset Card for \"librispeech960-wavlm-large-km1000_asr\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"librispeech960-wavlm-large-km1000_asr\"\n\nMore Information needed"
] | [
6,
27
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"librispeech960-wavlm-large-km1000_asr\"\n\nMore Information needed"
] |
202b971078b1b1be719bab211bfdc7b9848bed71 | This dataset is used to adapt [DeepLabCut](https://www.mackenziemathislab.org/deeplabcut) for human motion tracking.
### Structure of the dataset
- `videos` contains 100+ videos of 4 candidates recorded during a game of darts.
- `labeled-data` contains labels on the corresponding frames of the videos. These labels are used to adapt DeepLabCut for human motion tracking. Under `labeled-data` there are 2 folders for every video.
- `video_name` contains all the relevant frames extracted from the video, the x/y coordinates of the labels in a CSV file, and the corresponding H5 file.
- `video_name`_labeled contains the frames of `video_name` with the labels overlaid. | pratikshapai/human-motion-tracking-deeplabcut | [
"region:us"
] | 2023-12-05T02:52:32+00:00 | {} | 2024-01-18T02:14:42+00:00 | [] | [] | TAGS
#region-us
| This dataset is used to adapt DeepLabCut for human motion tracking.
### Structure of the dataset
- 'videos' contains 100+ videos of 4 candidates recorded during a game of darts.
- 'labeled-data' contains labels on the corresponding frames of the videos. These labels are used to adapt DeepLabCut for human motion tracking. Under 'labeled-data' there are 2 folders for every video.
- 'video_name' contains all the relevant frames extracted from the video, the x/y coordinates of the labels in a CSV file, and the corresponding H5 file.
- 'video_name'_labeled contains the frames of 'video_name' with the labels overlaid. | [
"### Structure of the dataset\n- 'videos' contains 100+ videos of 4 candidates recorded during a game of darts.\n- 'labeled-data' contains labels on the corresponding frames of the videos. These labels are used to adapt DeepLabCut for human motion tracking. Under 'labeled-data' there are 2 folders for every video.\... | [
"TAGS\n#region-us \n",
"### Structure of the dataset\n- 'videos' contains 100+ videos of 4 candidates recorded during a game of darts.\n- 'labeled-data' contains labels on the corresponding frames of the videos. These labels are used to adapt DeepLabCut for human motion tracking. Under 'labeled-data' there are 2 ... | [
6,
153
] | [
"passage: TAGS\n#region-us \n### Structure of the dataset\n- 'videos' contains 100+ videos of 4 candidates recorded during a game of darts.\n- 'labeled-data' contains labels on the corresponding frames of the videos. These labels are used to adapt DeepLabCut for human motion tracking. Under 'labeled-data' there are... |
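The per-frame label files mentioned in the DeepLabCut record above presumably follow DeepLabCut's usual CSV export layout, with three header rows (scorer, bodypart, coordinate) before the per-frame rows; the sketch below parses a toy file of that shape. The column names and layout are an assumption, not verified against this dataset.

```python
import csv
import io

# Toy CSV mimicking DeepLabCut's usual export layout (three header rows:
# scorer, bodypart, coordinate). The real files under labeled-data/ may
# differ; treat this layout as an assumption.
sample = """scorer,annotator,annotator
bodyparts,wrist,wrist
coords,x,y
img001.png,12.5,48.0
"""

rows = list(csv.reader(io.StringIO(sample)))
bodyparts, coords = rows[1][1:], rows[2][1:]
frame, values = rows[3][0], [float(v) for v in rows[3][1:]]
# Map each "<bodypart>_<coord>" name to its pixel value for this frame.
labels = {f"{bp}_{c}": v for bp, c, v in zip(bodyparts, coords, values)}
print(frame, labels)
```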
a3c887e7f0c63f82551874a7354582323b7ac519 |
# Dataset Card for Evaluation run of brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties](https://huggingface.co/brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-05T03:16:54.690977](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties/blob/main/results_2023-12-05T03-16-54.690977.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7567711901753588,
"acc_stderr": 0.028382267920122734,
"acc_norm": 0.7615616815437645,
"acc_norm_stderr": 0.028914131489708655,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.5583921075323958,
"mc2_stderr": 0.015750345067611658
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726097
},
"harness|hellaswag|10": {
"acc": 0.6693885680143398,
"acc_stderr": 0.004694718918225748,
"acc_norm": 0.8591913961362279,
"acc_norm_stderr": 0.0034711315448920457
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9078947368421053,
"acc_stderr": 0.02353268597044349,
"acc_norm": 0.9078947368421053,
"acc_norm_stderr": 0.02353268597044349
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8301886792452831,
"acc_stderr": 0.02310839379984132,
"acc_norm": 0.8301886792452831,
"acc_norm_stderr": 0.02310839379984132
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.026280550932848076,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.026280550932848076
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.049512182523962604,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.049512182523962604
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.026947483121496224,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.026947483121496224
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6878306878306878,
"acc_stderr": 0.023865206836972592,
"acc_norm": 0.6878306878306878,
"acc_norm_stderr": 0.023865206836972592
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.01730838128103453,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.01730838128103453
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723332,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723332
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.019982347208637303,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.019982347208637303
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.40370370370370373,
"acc_stderr": 0.029914812342227627,
"acc_norm": 0.40370370370370373,
"acc_norm_stderr": 0.029914812342227627
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.02273020811930654,
"acc_norm": 0.8571428571428571,
"acc_norm_stderr": 0.02273020811930654
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9229357798165138,
"acc_stderr": 0.011434381698911096,
"acc_norm": 0.9229357798165138,
"acc_norm_stderr": 0.011434381698911096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640266,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640266
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445784,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445784
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665168,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665168
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9106002554278416,
"acc_stderr": 0.010203017847688303,
"acc_norm": 0.9106002554278416,
"acc_norm_stderr": 0.010203017847688303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8179190751445087,
"acc_stderr": 0.020776761102512992,
"acc_norm": 0.8179190751445087,
"acc_norm_stderr": 0.020776761102512992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7094972067039106,
"acc_stderr": 0.015183844307206165,
"acc_norm": 0.7094972067039106,
"acc_norm_stderr": 0.015183844307206165
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213505,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213505
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281123,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6453900709219859,
"acc_stderr": 0.02853865002887863,
"acc_norm": 0.6453900709219859,
"acc_norm_stderr": 0.02853865002887863
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5873533246414603,
"acc_stderr": 0.012573836633799022,
"acc_norm": 0.5873533246414603,
"acc_norm_stderr": 0.012573836633799022
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02388688192244033,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02388688192244033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8218954248366013,
"acc_stderr": 0.015478369653108566,
"acc_norm": 0.8218954248366013,
"acc_norm_stderr": 0.015478369653108566
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8530612244897959,
"acc_stderr": 0.022665400417217638,
"acc_norm": 0.8530612244897959,
"acc_norm_stderr": 0.022665400417217638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.020190670535027908,
"acc_norm": 0.9104477611940298,
"acc_norm_stderr": 0.020190670535027908
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.5583921075323958,
"mc2_stderr": 0.015750345067611658
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363698
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.013373971277729817
}
}
```
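As a sanity check, the top-level "all" numbers are (approximately) macro-averages of the per-task entries; the sketch below recomputes such an average over a trimmed copy of three of the task accuracies above. The three values are copied verbatim from the results dict; restricting to three tasks is only for brevity.

```python
# Trimmed copy of three per-task accuracies from the results above.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.7407407407407407},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.9078947368421053},
    "harness|hendrycksTest-virology|5": {"acc": 0.5662650602409639},
}

# Macro-average: unweighted mean of the per-task accuracies.
accs = [v["acc"] for v in results.values()]
macro_avg = sum(accs) / len(accs)
print(round(macro_avg, 4))
```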
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties | [
"region:us"
] | 2023-12-05T03:19:43+00:00 | {"pretty_name": "Evaluation run of brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties](https://huggingface.co/brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-05T03:16:54.690977](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties/blob/main/results_2023-12-05T03-16-54.690977.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7567711901753588,\n \"acc_stderr\": 0.028382267920122734,\n \"acc_norm\": 0.7615616815437645,\n \"acc_norm_stderr\": 0.028914131489708655,\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5583921075323958,\n \"mc2_stderr\": 0.015750345067611658\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726097\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6693885680143398,\n \"acc_stderr\": 0.004694718918225748,\n \"acc_norm\": 0.8591913961362279,\n \"acc_norm_stderr\": 0.0034711315448920457\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9078947368421053,\n \"acc_stderr\": 0.02353268597044349,\n \"acc_norm\": 0.9078947368421053,\n \"acc_norm_stderr\": 0.02353268597044349\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8301886792452831,\n \"acc_stderr\": 0.02310839379984132,\n \"acc_norm\": 0.8301886792452831,\n \"acc_norm_stderr\": 0.02310839379984132\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.026280550932848076,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.026280550932848076\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.049512182523962604,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.049512182523962604\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.026947483121496224,\n \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.026947483121496224\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.03600105692727771,\n \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.03600105692727771\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6878306878306878,\n \"acc_stderr\": 0.023865206836972592,\n \"acc_norm\": 0.6878306878306878,\n \"acc_norm_stderr\": 0.023865206836972592\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 
0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.01730838128103453,\n \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.01730838128103453\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723332,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723332\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.019982347208637303,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637303\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.40370370370370373,\n \"acc_stderr\": 0.029914812342227627,\n \"acc_norm\": 0.40370370370370373,\n \"acc_norm_stderr\": 0.029914812342227627\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.02273020811930654,\n \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.02273020811930654\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9229357798165138,\n \"acc_stderr\": 0.011434381698911096,\n \"acc_norm\": 0.9229357798165138,\n \"acc_norm_stderr\": 0.011434381698911096\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.03293377139415191,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.03293377139415191\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640266,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640266\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445784,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445784\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665168,\n 
\"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665168\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9106002554278416,\n \"acc_stderr\": 0.010203017847688303,\n \"acc_norm\": 0.9106002554278416,\n \"acc_norm_stderr\": 0.010203017847688303\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8179190751445087,\n \"acc_stderr\": 0.020776761102512992,\n \"acc_norm\": 0.8179190751445087,\n \"acc_norm_stderr\": 0.020776761102512992\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7094972067039106,\n \"acc_stderr\": 0.015183844307206165,\n \"acc_norm\": 0.7094972067039106,\n \"acc_norm_stderr\": 0.015183844307206165\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213505,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213505\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n \"acc_stderr\": 0.022691033780549656,\n 
\"acc_norm\": 0.8006430868167203,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281123,\n \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281123\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6453900709219859,\n \"acc_stderr\": 0.02853865002887863,\n \"acc_norm\": 0.6453900709219859,\n \"acc_norm_stderr\": 0.02853865002887863\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5873533246414603,\n \"acc_stderr\": 0.012573836633799022,\n \"acc_norm\": 0.5873533246414603,\n \"acc_norm_stderr\": 0.012573836633799022\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02388688192244033,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02388688192244033\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8218954248366013,\n \"acc_stderr\": 0.015478369653108566,\n \"acc_norm\": 0.8218954248366013,\n \"acc_norm_stderr\": 0.015478369653108566\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8530612244897959,\n \"acc_stderr\": 0.022665400417217638,\n \"acc_norm\": 0.8530612244897959,\n \"acc_norm_stderr\": 0.022665400417217638\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9104477611940298,\n \"acc_stderr\": 0.020190670535027908,\n \"acc_norm\": 0.9104477611940298,\n \"acc_norm_stderr\": 0.020190670535027908\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n 
\"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5583921075323958,\n \"mc2_stderr\": 0.015750345067611658\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363698\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \"acc_stderr\": 0.013373971277729817\n }\n}\n```", "repo_url": "https://huggingface.co/brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|arc:challenge|25_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|gsm8k|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hellaswag|10_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-16-54.690977.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-16-54.690977.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-16-54.690977.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-16-54.690977.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-16-54.690977.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-16-54.690977.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T03-16-54.690977.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": 
"2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": 
"2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T03-16-54.690977.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["**/details_harness|winogrande|5_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-05T03-16-54.690977.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_05T03_16_54.690977", "path": ["results_2023-12-05T03-16-54.690977.parquet"]}, {"split": "latest", "path": ["results_2023-12-05T03-16-54.690977.parquet"]}]}]} | 2023-12-05T03:20:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-05T03:16:54.690977 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model brucet... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evalu... | [
6,
34,
31,
183,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run... |
0582076c7901fc28b6e10f61ae47686e4426d13d |
# Pexels 400k
Dataset of 400,476 videos, their thumbnails, view counts, <s>explicit classification,</s> and captions.
Note: The Pexels-320k dataset in the repo is this dataset, with videos <10s removed.
| jovianzm/Pexels-400k | [
"task_categories:image-to-text",
"task_categories:text-to-image",
"task_categories:text-to-video",
"task_categories:image-to-video",
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"region:us"
] | 2023-12-05T03:35:57+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["image-to-text", "text-to-image", "text-to-video", "image-to-video"], "pretty_name": "Pexels-400k"} | 2024-01-15T19:04:50+00:00 | [] | [
"en"
] | TAGS
#task_categories-image-to-text #task_categories-text-to-image #task_categories-text-to-video #task_categories-image-to-video #size_categories-100K<n<1M #language-English #license-mit #region-us
|
# Pexels 400k
Dataset of 400,476 videos, their thumbnails, view counts, <s>explicit classification,</s> and captions.
Note: The Pexels-320k dataset in the repo is this dataset, with videos <10s removed.
| [
"# Pexels 400k\n\nDataset of 400,476 videos, their thumbnails, viewcounts, <s>explicit classification,</s> and caption.\n\nNote: The Pexels-320k dataset in the repo is this dataset, with videos <10s removed."
] | [
"TAGS\n#task_categories-image-to-text #task_categories-text-to-image #task_categories-text-to-video #task_categories-image-to-video #size_categories-100K<n<1M #language-English #license-mit #region-us \n",
"# Pexels 400k\n\nDataset of 400,476 videos, their thumbnails, viewcounts, <s>explicit classification,</s> a... | [
75,
59
] | [
"passage: TAGS\n#task_categories-image-to-text #task_categories-text-to-image #task_categories-text-to-video #task_categories-image-to-video #size_categories-100K<n<1M #language-English #license-mit #region-us \n# Pexels 400k\n\nDataset of 400,476 videos, their thumbnails, viewcounts, <s>explicit classification,</s... |
ddd27b58da8d587fd3b8a3450a379f2fce03bc9d | This dataset is a subset of the Open Assistant dataset, which you can find here: https://huggingface.co/datasets/OpenAssistant/oasst1/tree/main
This subset of the data only contains the highest-rated paths in the conversation tree, with a total of 9,846 samples.
This dataset was used to train Guanaco with QLoRA.
For further information, please see the original dataset.
License: Apache 2.0 | JayMaier/assistant_test | [
"region:us"
] | 2023-12-05T03:38:08+00:00 | {} | 2023-12-05T05:04:37+00:00 | [] | [] | TAGS
#region-us
| This dataset is a subset of the Open Assistant dataset, which you can find here: URL
This subset of the data only contains the highest-rated paths in the conversation tree, with a total of 9,846 samples.
This dataset was used to train Guanaco with QLoRA.
For further information, please see the original dataset.
License: Apache 2.0 | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
cadc7e631598d6b0631754cb167cfbac6aee1416 |
# Dataset Card for Evaluation run of 01-ai/Yi-34B-200K
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/01-ai/Yi-34B-200K
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [01-ai/Yi-34B-200K](https://huggingface.co/01-ai/Yi-34B-200K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_01-ai__Yi-34B-200K",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-05T03:41:41.478096](https://huggingface.co/datasets/open-llm-leaderboard/details_01-ai__Yi-34B-200K/blob/main/results_2023-12-05T03-41-41.478096.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7553618929104267,
"acc_stderr": 0.02837585903729335,
"acc_norm": 0.7603811984841083,
"acc_norm_stderr": 0.028905075105130153,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.5364445120598228,
"mc2_stderr": 0.014804162952722544
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759091,
"acc_norm": 0.6535836177474402,
"acc_norm_stderr": 0.013905011180063227
},
"harness|hellaswag|10": {
"acc": 0.6557458673571002,
"acc_stderr": 0.004741534106470288,
"acc_norm": 0.8558056164110734,
"acc_norm_stderr": 0.0035056879433872927
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.02350873921884694,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.02350873921884694
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.033687629322594316,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.033687629322594316
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7659574468085106,
"acc_stderr": 0.02767845257821239,
"acc_norm": 0.7659574468085106,
"acc_norm_stderr": 0.02767845257821239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7724137931034483,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.7724137931034483,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6322751322751323,
"acc_stderr": 0.02483383982556242,
"acc_norm": 0.6322751322751323,
"acc_norm_stderr": 0.02483383982556242
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8935483870967742,
"acc_stderr": 0.01754510295165663,
"acc_norm": 0.8935483870967742,
"acc_norm_stderr": 0.01754510295165663
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6748768472906403,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.6748768472906403,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781675,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781675
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909039,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909039
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8051282051282052,
"acc_stderr": 0.020083167595181393,
"acc_norm": 0.8051282051282052,
"acc_norm_stderr": 0.020083167595181393
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.029958249250082114,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.029958249250082114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.819327731092437,
"acc_stderr": 0.02499196496660077,
"acc_norm": 0.819327731092437,
"acc_norm_stderr": 0.02499196496660077
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769574,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.03256850570293648,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.03256850570293648
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804468,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804468
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407388,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407388
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.029239272675632748,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.029239272675632748
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8834355828220859,
"acc_stderr": 0.02521232721050711,
"acc_norm": 0.8834355828220859,
"acc_norm_stderr": 0.02521232721050711
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311357,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311357
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9042145593869731,
"acc_stderr": 0.01052403107905584,
"acc_norm": 0.9042145593869731,
"acc_norm_stderr": 0.01052403107905584
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.021152676966575277,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.021152676966575277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6558659217877095,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.6558659217877095,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.869281045751634,
"acc_stderr": 0.01930187362421527,
"acc_norm": 0.869281045751634,
"acc_norm_stderr": 0.01930187362421527
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.842443729903537,
"acc_stderr": 0.020692237273583984,
"acc_norm": 0.842443729903537,
"acc_norm_stderr": 0.020692237273583984
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571853,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6099290780141844,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.6099290780141844,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5990873533246415,
"acc_stderr": 0.012516960350640814,
"acc_norm": 0.5990873533246415,
"acc_norm_stderr": 0.012516960350640814
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.02334516361654484,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.02334516361654484
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.015588643495370463,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.015588643495370463
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007646,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007646
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.0201906705350279,
"acc_norm": 0.9104477611940298,
"acc_norm_stderr": 0.0201906705350279
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.5364445120598228,
"mc2_stderr": 0.014804162952722544
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498438
},
"harness|gsm8k|5": {
"acc": 0.6163760424564063,
"acc_stderr": 0.013394238584938161
}
}
```
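As an illustration of how these per-task entries can be consumed (a minimal sketch using a small excerpt of the dict above, not part of the generated card), the `acc_norm` scores of the `hendrycksTest` (MMLU) subtasks can be averaged like this:

```python
# Excerpt of the results dict shown above (three MMLU subtasks)
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.42},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.7185185185185186},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.8618421052631579},
}

# Collect acc_norm for every hendrycksTest subtask and average them
mmlu_scores = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
average = sum(mmlu_scores) / len(mmlu_scores)
print(round(average, 4))
```

The same pattern extends to all 57 `hendrycksTest` subtasks in the full dict to recover the aggregated MMLU score reported on the leaderboard.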
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_01-ai__Yi-34B-200K | [
"region:us"
] | 2023-12-05T03:44:28+00:00 | {"pretty_name": "Evaluation run of 01-ai/Yi-34B-200K", "dataset_summary": "Dataset automatically created during the evaluation run of model [01-ai/Yi-34B-200K](https://huggingface.co/01-ai/Yi-34B-200K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_01-ai__Yi-34B-200K\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-05T03:41:41.478096](https://huggingface.co/datasets/open-llm-leaderboard/details_01-ai__Yi-34B-200K/blob/main/results_2023-12-05T03-41-41.478096.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7553618929104267,\n \"acc_stderr\": 0.02837585903729335,\n \"acc_norm\": 0.7603811984841083,\n \"acc_norm_stderr\": 0.028905075105130153,\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5364445120598228,\n \"mc2_stderr\": 0.014804162952722544\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759091,\n \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6557458673571002,\n \"acc_stderr\": 0.004741534106470288,\n \"acc_norm\": 0.8558056164110734,\n \"acc_norm_stderr\": 0.0035056879433872927\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.02350873921884694,\n \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.02350873921884694\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.033687629322594316,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.033687629322594316\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7659574468085106,\n \"acc_stderr\": 0.02767845257821239,\n \"acc_norm\": 0.7659574468085106,\n \"acc_norm_stderr\": 0.02767845257821239\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7724137931034483,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.7724137931034483,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6322751322751323,\n \"acc_stderr\": 0.02483383982556242,\n \"acc_norm\": 0.6322751322751323,\n \"acc_norm_stderr\": 0.02483383982556242\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 
0.5793650793650794,\n \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8935483870967742,\n \"acc_stderr\": 0.01754510295165663,\n \"acc_norm\": 0.8935483870967742,\n \"acc_norm_stderr\": 0.01754510295165663\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6748768472906403,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781675,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781675\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909039,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909039\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8051282051282052,\n \"acc_stderr\": 0.020083167595181393,\n \"acc_norm\": 0.8051282051282052,\n \"acc_norm_stderr\": 0.020083167595181393\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.029958249250082114,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.029958249250082114\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 
0.819327731092437,\n \"acc_stderr\": 0.02499196496660077,\n \"acc_norm\": 0.819327731092437,\n \"acc_norm_stderr\": 0.02499196496660077\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769574,\n \"acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769574\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.03256850570293648,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.03256850570293648\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804468,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804468\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.029239272675632748,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 
0.029239272675632748\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8834355828220859,\n \"acc_stderr\": 0.02521232721050711,\n \"acc_norm\": 0.8834355828220859,\n \"acc_norm_stderr\": 0.02521232721050711\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.016534627684311357,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.016534627684311357\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9042145593869731,\n \"acc_stderr\": 0.01052403107905584,\n \"acc_norm\": 0.9042145593869731,\n \"acc_norm_stderr\": 0.01052403107905584\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575277,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575277\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6558659217877095,\n \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.6558659217877095,\n \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.869281045751634,\n \"acc_stderr\": 0.01930187362421527,\n \"acc_norm\": 0.869281045751634,\n \"acc_norm_stderr\": 0.01930187362421527\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.842443729903537,\n \"acc_stderr\": 0.020692237273583984,\n \"acc_norm\": 0.842443729903537,\n \"acc_norm_stderr\": 
0.020692237273583984\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571853,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571853\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6099290780141844,\n \"acc_stderr\": 0.02909767559946393,\n \"acc_norm\": 0.6099290780141844,\n \"acc_norm_stderr\": 0.02909767559946393\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5990873533246415,\n \"acc_stderr\": 0.012516960350640814,\n \"acc_norm\": 0.5990873533246415,\n \"acc_norm_stderr\": 0.012516960350640814\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654484,\n \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654484\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.015588643495370463,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.015588643495370463\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007646,\n \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007646\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9104477611940298,\n \"acc_stderr\": 0.0201906705350279,\n \"acc_norm\": 0.9104477611940298,\n \"acc_norm_stderr\": 0.0201906705350279\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n 
\"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5364445120598228,\n \"mc2_stderr\": 0.014804162952722544\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498438\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6163760424564063,\n \"acc_stderr\": 0.013394238584938161\n }\n}\n```", "repo_url": "https://huggingface.co/01-ai/Yi-34B-200K", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|arc:challenge|25_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|gsm8k|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hellaswag|10_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-41-41.478096.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-41-41.478096.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-41-41.478096.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-41-41.478096.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-41-41.478096.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-41-41.478096.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T03-41-41.478096.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T03-41-41.478096.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["**/details_harness|winogrande|5_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-05T03-41-41.478096.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_05T03_41_41.478096", "path": ["results_2023-12-05T03-41-41.478096.parquet"]}, {"split": "latest", "path": ["results_2023-12-05T03-41-41.478096.parquet"]}]}]} | 2023-12-05T03:45:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of 01-ai/Yi-34B-200K
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model 01-ai/Yi-34B-200K on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-05T03:41:41.478096 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
75097f42b5e159c2aadf2f35202fb4ec0da9edef |
# Dataset Card for Evaluation run of migtissera/Tess-M-Creative-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Tess-M-Creative-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Tess-M-Creative-v1.0](https://huggingface.co/migtissera/Tess-M-Creative-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-M-Creative-v1.0",
"harness_winogrande_5",
split="train")
```
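The config and split names used above follow a mechanical pattern, visible in this repository's own metadata: each task becomes a `harness_<task>_<fewshot>` config, and each run timestamp becomes a timestamped split name (with `"latest"` aliasing the most recent run). A minimal sketch of that mapping; the helper functions below are illustrative, not part of the leaderboard tooling:

```python
def config_name(task: str, n_fewshot: int) -> str:
    """Build a details config name, e.g. task 'winogrande' evaluated
    5-shot -> 'harness_winogrande_5'."""
    return f"harness_{task}_{n_fewshot}"


def split_name(timestamp: str) -> str:
    """Turn a run timestamp such as '2023-12-05T03:45:38.672992'
    into its split name '2023_12_05T03_45_38.672992'."""
    return timestamp.replace("-", "_").replace(":", "_")


print(config_name("winogrande", 5))              # harness_winogrande_5
print(split_name("2023-12-05T03:45:38.672992"))  # 2023_12_05T03_45_38.672992
```

Both the timestamped split and `"latest"` point at the same parquet files for a single-run repository like this one.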
## Latest results
These are the [latest results from run 2023-12-05T03:45:38.672992](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-M-Creative-v1.0/blob/main/results_2023-12-05T03-45-38.672992.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7506953369656723,
"acc_stderr": 0.028559826064592703,
"acc_norm": 0.755544561120704,
"acc_norm_stderr": 0.029096967565438774,
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5768450076180885,
"mc2_stderr": 0.014925146586405758
},
"harness|arc:challenge|25": {
"acc": 0.6331058020477816,
"acc_stderr": 0.014084133118104296,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.01376098820088053
},
"harness|hellaswag|10": {
"acc": 0.6496713802031467,
"acc_stderr": 0.004760978203023324,
"acc_norm": 0.8514240191196972,
"acc_norm_stderr": 0.003549431247907371
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.024079995130062246,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.024079995130062246
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.026280550932848062,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.026280550932848062
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.049512182523962604,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.049512182523962604
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387533,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387533
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7103448275862069,
"acc_stderr": 0.03780019230438015,
"acc_norm": 0.7103448275862069,
"acc_norm_stderr": 0.03780019230438015
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6931216931216931,
"acc_stderr": 0.02375292871211214,
"acc_norm": 0.6931216931216931,
"acc_norm_stderr": 0.02375292871211214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.017066403719657255,
"acc_norm": 0.9,
"acc_norm_stderr": 0.017066403719657255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6847290640394089,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.6847290640394089,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781668,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781668
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.823076923076923,
"acc_stderr": 0.019348070174396985,
"acc_norm": 0.823076923076923,
"acc_norm_stderr": 0.019348070174396985
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476668,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707946,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.0408024418562897,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.0408024418562897
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9284403669724771,
"acc_stderr": 0.011051255247815453,
"acc_norm": 0.9284403669724771,
"acc_norm_stderr": 0.011051255247815453
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01990739979131695,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01990739979131695
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.01809424711647332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.01809424711647332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407252,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407252
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275896,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.02728524631275896
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.0314570385430625,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.0314570385430625
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5803571428571429,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.5803571428571429,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813234,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813234
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292849,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292849
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.021152676966575284,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.021152676966575284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6972067039106146,
"acc_stderr": 0.015366860386397112,
"acc_norm": 0.6972067039106146,
"acc_norm_stderr": 0.015366860386397112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213516,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8580246913580247,
"acc_stderr": 0.019420260109438293,
"acc_norm": 0.8580246913580247,
"acc_norm_stderr": 0.019420260109438293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.02894733885161409,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.02894733885161409
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5951760104302477,
"acc_stderr": 0.012536743830953979,
"acc_norm": 0.5951760104302477,
"acc_norm_stderr": 0.012536743830953979
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8308823529411765,
"acc_stderr": 0.022770868010113004,
"acc_norm": 0.8308823529411765,
"acc_norm_stderr": 0.022770868010113004
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.01569702924075778,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.01569702924075778
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.02342097206916635,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.02342097206916635
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824657,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824657
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5768450076180885,
"mc2_stderr": 0.014925146586405758
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.01052998141183891
},
"harness|gsm8k|5": {
"acc": 0.6209249431387415,
"acc_stderr": 0.013363630295088356
}
}
```
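As a quick sanity check, per-task accuracies like those above can be macro-averaged by hand. A stdlib-only sketch over a small illustrative subset of the scores reported above (the full run covers 63 configurations):

```python
# Macro-average a handful of the per-task accuracies reported above.
# The task subset is illustrative only.
scores = {
    "harness|hendrycksTest-astronomy|5": 0.875,
    "harness|hendrycksTest-college_biology|5": 0.8888888888888888,
    "harness|hendrycksTest-virology|5": 0.5843373493975904,
}

macro_acc = sum(scores.values()) / len(scores)
print(f"macro-average acc over {len(scores)} tasks: {macro_acc:.4f}")
```

The leaderboard's aggregated "all" entry is computed the same way, but over every evaluated configuration rather than this three-task subset.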
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_migtissera__Tess-M-Creative-v1.0 | [
"region:us"
] | 2023-12-05T03:48:25+00:00 | {"pretty_name": "Evaluation run of migtissera/Tess-M-Creative-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/Tess-M-Creative-v1.0](https://huggingface.co/migtissera/Tess-M-Creative-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-M-Creative-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-05T03:45:38.672992](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-M-Creative-v1.0/blob/main/results_2023-12-05T03-45-38.672992.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7506953369656723,\n \"acc_stderr\": 0.028559826064592703,\n \"acc_norm\": 0.755544561120704,\n \"acc_norm_stderr\": 0.029096967565438774,\n \"mc1\": 0.41982864137086906,\n \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5768450076180885,\n \"mc2_stderr\": 0.014925146586405758\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6331058020477816,\n \"acc_stderr\": 0.014084133118104296,\n \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.01376098820088053\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6496713802031467,\n \"acc_stderr\": 0.004760978203023324,\n \"acc_norm\": 0.8514240191196972,\n \"acc_norm_stderr\": 0.003549431247907371\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.024079995130062246,\n \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.024079995130062246\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.026280550932848062,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.026280550932848062\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.049512182523962604,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.049512182523962604\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387533,\n \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387533\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7103448275862069,\n \"acc_stderr\": 0.03780019230438015,\n \"acc_norm\": 0.7103448275862069,\n \"acc_norm_stderr\": 0.03780019230438015\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6931216931216931,\n \"acc_stderr\": 0.02375292871211214,\n \"acc_norm\": 0.6931216931216931,\n \"acc_norm_stderr\": 0.02375292871211214\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 
0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.017066403719657255,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.017066403719657255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6847290640394089,\n \"acc_stderr\": 0.03269080871970186,\n \"acc_norm\": 0.6847290640394089,\n \"acc_norm_stderr\": 0.03269080871970186\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781668,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781668\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.823076923076923,\n \"acc_stderr\": 0.019348070174396985,\n \"acc_norm\": 0.823076923076923,\n \"acc_norm_stderr\": 0.019348070174396985\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476668,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476668\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n 
\"acc_stderr\": 0.023274255898707946,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707946\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.0408024418562897,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.0408024418562897\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9284403669724771,\n \"acc_stderr\": 0.011051255247815453,\n \"acc_norm\": 0.9284403669724771,\n \"acc_norm_stderr\": 0.011051255247815453\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.01990739979131695,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01990739979131695\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647332,\n \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647332\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275896,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275896\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.0314570385430625,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.0314570385430625\n },\n 
\"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813234,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813234\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9067688378033205,\n \"acc_stderr\": 0.010397417087292849,\n \"acc_norm\": 0.9067688378033205,\n \"acc_norm_stderr\": 0.010397417087292849\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575284,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6972067039106146,\n \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 0.6972067039106146,\n \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213516,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213516\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n 
\"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8580246913580247,\n \"acc_stderr\": 0.019420260109438293,\n \"acc_norm\": 0.8580246913580247,\n \"acc_norm_stderr\": 0.019420260109438293\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6205673758865248,\n \"acc_stderr\": 0.02894733885161409,\n \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.02894733885161409\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5951760104302477,\n \"acc_stderr\": 0.012536743830953979,\n \"acc_norm\": 0.5951760104302477,\n \"acc_norm_stderr\": 0.012536743830953979\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8308823529411765,\n \"acc_stderr\": 0.022770868010113004,\n \"acc_norm\": 0.8308823529411765,\n \"acc_norm_stderr\": 0.022770868010113004\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.02342097206916635,\n \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.02342097206916635\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824657,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 
0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41982864137086906,\n \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5768450076180885,\n \"mc2_stderr\": 0.014925146586405758\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.01052998141183891\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6209249431387415,\n \"acc_stderr\": 0.013363630295088356\n }\n}\n```", "repo_url": "https://huggingface.co/migtissera/Tess-M-Creative-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|arc:challenge|25_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|gsm8k|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hellaswag|10_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-45-38.672992.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-45-38.672992.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-45-38.672992.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-45-38.672992.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-45-38.672992.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-45-38.672992.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T03-45-38.672992.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T03-45-38.672992.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["**/details_harness|winogrande|5_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-05T03-45-38.672992.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_05T03_45_38.672992", "path": ["results_2023-12-05T03-45-38.672992.parquet"]}, {"split": "latest", "path": ["results_2023-12-05T03-45-38.672992.parquet"]}]}]} | 2023-12-05T03:49:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of migtissera/Tess-M-Creative-v1.0
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model migtissera/Tess-M-Creative-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
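The split-naming convention can be turned back into a real timestamp with a few lines of Python (the helper below is an illustration, not part of the card):

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names such as "2023_12_05T03_45_38.672992" are run timestamps
    # with "-" and ":" replaced by "_"; undo that and parse the ISO-like form.
    date_part, time_part = split_name.split("T")
    return datetime.strptime(
        date_part.replace("_", "-") + "T" + time_part.replace("_", ":"),
        "%Y-%m-%dT%H:%M:%S.%f",
    )

run_time = split_to_datetime("2023_12_05T03_45_38.672992")
```

Note that the special split name "latest" is an alias for the most recent run rather than a timestamp, so it should be handled separately before parsing.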
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
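A minimal sketch of that loading step (the helper names are mine, and the repo id is assumed from the leaderboard's `details_<org>__<model>` naming convention seen in the sibling cards):

```python
def details_repo_id(model_id: str) -> str:
    # Assumed convention: "org/model" becomes "details_org__model"
    # under the open-llm-leaderboard namespace.
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

def load_details(model_id: str, config: str = "harness_winogrande_5",
                 split: str = "train"):
    # Lazy import so the sketch stays importable without the package installed.
    from datasets import load_dataset
    return load_dataset(details_repo_id(model_id), config, split=split)

# e.g. load_details("migtissera/Tess-M-Creative-v1.0")
```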
## Latest results
These are the latest results from run 2023-12-05T03:45:38.672992 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of migtissera/Tess-M-Creative-v1.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Tess-M-Creat... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of migtissera/Tess-M-Creative-v1.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mode... | [
6,
24,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of migtissera/Tess-M-Creative-v1.0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtiss... |
83c368c41c88352cf932876699fb1fbbd17356ed | # Dataset Card for "librispeech960-encodec1024_asr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | cmu-mlsp/librispeech960-encodec1024_asr | [
"region:us"
] | 2023-12-05T03:58:58+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation_other", "path": "data/validation_other-*"}, {"split": "test_other", "path": "data/test_other-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "audio_codes", "sequence": "string"}, {"name": "id", "dtype": "string"}, {"name": "speaker_id", "dtype": "int64"}, {"name": "chapter_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1859401929, "num_examples": 281241}, {"name": "validation", "num_bytes": 10515210, "num_examples": 2703}, {"name": "test", "num_bytes": 10516648, "num_examples": 2620}, {"name": "validation_other", "num_bytes": 9974741, "num_examples": 2864}, {"name": "test_other", "num_bytes": 10389123, "num_examples": 2939}], "download_size": 0, "dataset_size": 1900797651}} | 2023-12-05T17:19:28+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "librispeech960-encodec1024_asr"
More Information needed | [
"# Dataset Card for \"librispeech960-encodec1024_asr\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"librispeech960-encodec1024_asr\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"librispeech960-encodec1024_asr\"\n\nMore Information needed"
] |
8b3ab603f3556dc30844e7c36bef322ac916c862 | # Dataset Card for "Customizable-Code-Assistant-Data"
## Dataset Summary
This dataset is a dummy version of the Customizable Code Assistant Dataset.
## Supported Tasks and Leaderboards
Customizable Code Assistant is a dataset for code completion. The task is to predict the next token in a code snippet. The dataset is designed to be customizable, so that it can be used for different programming languages and different code completion tasks.
[More Information Needed] | ammarnasr/Customizable-Code-Assistant-Data | [
"region:us"
] | 2023-12-05T04:17:57+00:00 | {"dataset_info": {"features": [{"name": "repo_name", "dtype": "string"}, {"name": "repo_url", "dtype": "string"}, {"name": "repo_description", "dtype": "string"}, {"name": "repo_stars", "dtype": "int64"}, {"name": "repo_forks", "dtype": "int64"}, {"name": "repo_last_updated", "dtype": "string"}, {"name": "repo_created_at", "dtype": "string"}, {"name": "repo_size", "dtype": "int64"}, {"name": "repo_license", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "avg_line_length", "dtype": "float64"}, {"name": "max_line_length", "dtype": "int64"}, {"name": "alphnanum_fraction", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 2004792, "num_examples": 604}], "download_size": 174531, "dataset_size": 2004792}} | 2023-12-05T04:21:57+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Customizable-Code-Assistant-Data"
## Dataset Summary
This dataset is a dummy version of the Customizable Code Assistant Dataset.
## Supported Tasks and Leaderboards
Customizable Code Assistant is a dataset for code completion. The task is to predict the next token in a code snippet. The dataset is designed to be customizable, so that it can be used for different programming languages and different code completion tasks.
| [
"# Dataset Card for \"Customizable-Code-Assistant-Data\"",
"## Dataset Summary\n\nThis dataset contains is a dummy Version of the Customizable Code Assistant Dataset.",
"## Supported Tasks and Leaderboards\n\nCustomizable Code Assistant is a dataset for code completion. The task is to predict the next token in ... | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Customizable-Code-Assistant-Data\"",
"## Dataset Summary\n\nThis dataset contains is a dummy Version of the Customizable Code Assistant Dataset.",
"## Supported Tasks and Leaderboards\n\nCustomizable Code Assistant is a dataset for code completion. The task is to p... | [
6,
19,
25,
69
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"Customizable-Code-Assistant-Data\"## Dataset Summary\n\nThis dataset contains is a dummy Version of the Customizable Code Assistant Dataset.## Supported Tasks and Leaderboards\n\nCustomizable Code Assistant is a dataset for code completion. The task is to predict th... |
75794087be02248e5d7e9c67b8f7a37d08e7e826 |
# Dataset Card for Evaluation run of JosephusCheung/Yee-34B-200K-Chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/Yee-34B-200K-Chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/Yee-34B-200K-Chat](https://huggingface.co/JosephusCheung/Yee-34B-200K-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-05T04:15:54.776905](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat/blob/main/results_2023-12-05T04-15-54.776905.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7397087702526806,
"acc_stderr": 0.028697152379174293,
"acc_norm": 0.749145830773331,
"acc_norm_stderr": 0.029232668522838182,
"mc1": 0.379436964504284,
"mc1_stderr": 0.01698703926614299,
"mc2": 0.538842608150276,
"mc2_stderr": 0.015448158590971197
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893446,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156218
},
"harness|hellaswag|10": {
"acc": 0.6506671977693687,
"acc_stderr": 0.0047578490234119605,
"acc_norm": 0.8432583150766779,
"acc_norm_stderr": 0.003628140427399768
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8301886792452831,
"acc_stderr": 0.023108393799841326,
"acc_norm": 0.8301886792452831,
"acc_norm_stderr": 0.023108393799841326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7617021276595745,
"acc_stderr": 0.027851252973889774,
"acc_norm": 0.7617021276595745,
"acc_norm_stderr": 0.027851252973889774
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6375661375661376,
"acc_stderr": 0.024757473902752045,
"acc_norm": 0.6375661375661376,
"acc_norm_stderr": 0.024757473902752045
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8612903225806452,
"acc_stderr": 0.019662961321414027,
"acc_norm": 0.8612903225806452,
"acc_norm_stderr": 0.019662961321414027
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02548549837334323,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02548549837334323
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047926,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047926
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527046,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527046
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7794871794871795,
"acc_stderr": 0.0210206726808279,
"acc_norm": 0.7794871794871795,
"acc_norm_stderr": 0.0210206726808279
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.819327731092437,
"acc_stderr": 0.02499196496660077,
"acc_norm": 0.819327731092437,
"acc_norm_stderr": 0.02499196496660077
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.0408024418562897,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.0408024418562897
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116245,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.01809424711647332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.01809424711647332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407256,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407256
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.026222235171477374,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.026222235171477374
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275896,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.02728524631275896
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.02684576505455386,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.02684576505455386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.033932957297610096,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.033932957297610096
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562586,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562586
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8978288633461047,
"acc_stderr": 0.010830724713134182,
"acc_norm": 0.8978288633461047,
"acc_norm_stderr": 0.010830724713134182
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.02115267696657528,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.02115267696657528
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7195530726256983,
"acc_stderr": 0.015024083883322895,
"acc_norm": 0.7195530726256983,
"acc_norm_stderr": 0.015024083883322895
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281123,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5560625814863103,
"acc_stderr": 0.012689708167787679,
"acc_norm": 0.5560625814863103,
"acc_norm_stderr": 0.012689708167787679
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8014705882352942,
"acc_stderr": 0.02423101337054109,
"acc_norm": 0.8014705882352942,
"acc_norm_stderr": 0.02423101337054109
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8218954248366013,
"acc_stderr": 0.015478369653108568,
"acc_norm": 0.8218954248366013,
"acc_norm_stderr": 0.015478369653108568
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.023661699177098615,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.023661699177098615
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.379436964504284,
"mc1_stderr": 0.01698703926614299,
"mc2": 0.538842608150276,
"mc2_stderr": 0.015448158590971197
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.01128501375404745
},
"harness|gsm8k|5": {
"acc": 0.3479909021986353,
"acc_stderr": 0.013120581030382132
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat | [
"region:us"
] | 2023-12-05T04:18:43+00:00 | {"pretty_name": "Evaluation run of JosephusCheung/Yee-34B-200K-Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [JosephusCheung/Yee-34B-200K-Chat](https://huggingface.co/JosephusCheung/Yee-34B-200K-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-05T04:15:54.776905](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat/blob/main/results_2023-12-05T04-15-54.776905.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7397087702526806,\n \"acc_stderr\": 0.028697152379174293,\n \"acc_norm\": 0.749145830773331,\n \"acc_norm_stderr\": 0.029232668522838182,\n \"mc1\": 0.379436964504284,\n \"mc1_stderr\": 0.01698703926614299,\n \"mc2\": 0.538842608150276,\n \"mc2_stderr\": 0.015448158590971197\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893446,\n \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156218\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6506671977693687,\n \"acc_stderr\": 0.0047578490234119605,\n \"acc_norm\": 0.8432583150766779,\n \"acc_norm_stderr\": 0.003628140427399768\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8301886792452831,\n \"acc_stderr\": 0.023108393799841326,\n \"acc_norm\": 0.8301886792452831,\n \"acc_norm_stderr\": 0.023108393799841326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n 
\"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.04940635630605659,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.04940635630605659\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.027851252973889774,\n \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.027851252973889774\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.03600105692727771,\n \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.03600105692727771\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6375661375661376,\n \"acc_stderr\": 0.024757473902752045,\n \"acc_norm\": 0.6375661375661376,\n \"acc_norm_stderr\": 0.024757473902752045\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 
0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8612903225806452,\n \"acc_stderr\": 0.019662961321414027,\n \"acc_norm\": 0.8612903225806452,\n \"acc_norm_stderr\": 0.019662961321414027\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.034139638059062345,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.034139638059062345\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02548549837334323,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02548549837334323\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9040404040404041,\n \"acc_stderr\": 0.020984808610047926,\n \"acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.020984808610047926\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527046,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527046\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7794871794871795,\n \"acc_stderr\": 0.0210206726808279,\n \"acc_norm\": 0.7794871794871795,\n \"acc_norm_stderr\": 0.0210206726808279\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n 
\"acc\": 0.819327731092437,\n \"acc_stderr\": 0.02499196496660077,\n \"acc_norm\": 0.819327731092437,\n \"acc_norm_stderr\": 0.02499196496660077\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.0408024418562897,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.0408024418562897\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116245,\n \"acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116245\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647332,\n \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647332\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.026222235171477374,\n \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.026222235171477374\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275896,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275896\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665167\n },\n 
\"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.02684576505455386,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.02684576505455386\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.033932957297610096,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.033932957297610096\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.01831589168562586,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.01831589168562586\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8978288633461047,\n \"acc_stderr\": 0.010830724713134182,\n \"acc_norm\": 0.8978288633461047,\n \"acc_norm_stderr\": 0.010830724713134182\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.02115267696657528,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.02115267696657528\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7195530726256983,\n \"acc_stderr\": 0.015024083883322895,\n \"acc_norm\": 0.7195530726256983,\n \"acc_norm_stderr\": 0.015024083883322895\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123138,\n \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123138\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n 
\"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281123,\n \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281123\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.02899908090480618,\n \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.02899908090480618\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5560625814863103,\n \"acc_stderr\": 0.012689708167787679,\n \"acc_norm\": 0.5560625814863103,\n \"acc_norm_stderr\": 0.012689708167787679\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.02423101337054109,\n \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.02423101337054109\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8218954248366013,\n \"acc_stderr\": 0.015478369653108568,\n \"acc_norm\": 0.8218954248366013,\n \"acc_norm_stderr\": 0.015478369653108568\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.023661699177098615,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.023661699177098615\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n \"acc_norm_stderr\": 
0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n \"mc1_stderr\": 0.01698703926614299,\n \"mc2\": 0.538842608150276,\n \"mc2_stderr\": 0.015448158590971197\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.01128501375404745\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3479909021986353,\n \"acc_stderr\": 0.013120581030382132\n }\n}\n```", "repo_url": "https://huggingface.co/JosephusCheung/Yee-34B-200K-Chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|arc:challenge|25_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|gsm8k|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hellaswag|10_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T04-15-54.776905.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T04-15-54.776905.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T04-15-54.776905.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T04-15-54.776905.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T04-15-54.776905.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T04-15-54.776905.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T04-15-54.776905.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T04-15-54.776905.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["**/details_harness|winogrande|5_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-05T04-15-54.776905.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_05T04_15_54.776905", "path": ["results_2023-12-05T04-15-54.776905.parquet"]}, {"split": "latest", "path": ["results_2023-12-05T04-15-54.776905.parquet"]}]}]} | 2023-12-05T04:19:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of JosephusCheung/Yee-34B-200K-Chat
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model JosephusCheung/Yee-34B-200K-Chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-05T04:15:54.776905 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of JosephusCheung/Yee-34B-200K-Chat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model JosephusCheung/Yee-34B... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of JosephusCheung/Yee-34B-200K-Chat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of mod... | [
6,
22,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of JosephusCheung/Yee-34B-200K-Chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Joseph... |
9bcd0c4f14e99c7c672e143dd88ab9bb32c3627f | # Dataset Card for "kor_dbpedia_14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).
Lehmann, Jens, Robert Isele, Max Jakob, Anja Jentzsch, Dimitris Kontokostas, Pablo N. Mendes, Sebastian Hellmann et al. "DBpedia–a large-scale, multilingual knowledge base extracted from Wikipedia." Semantic web 6, no. 2 (2015): 167-195.
``` | KETI-AIR/kor_dbpedia_14 | [
"license:cc-by-sa-3.0",
"region:us"
] | 2023-12-05T04:28:01+00:00 | {"license": "cc-by-sa-3.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "title", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "label", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 207331112, "num_examples": 560000}, {"name": "test", "num_bytes": 25970187, "num_examples": 70000}], "download_size": 136871622, "dataset_size": 233301299}} | 2023-12-05T04:29:58+00:00 | [] | [] | TAGS
#license-cc-by-sa-3.0 #region-us
| # Dataset Card for "kor_dbpedia_14"
More Information needed
# Source Data Citation Information
| [
"# Dataset Card for \"kor_dbpedia_14\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
"TAGS\n#license-cc-by-sa-3.0 #region-us \n",
"# Dataset Card for \"kor_dbpedia_14\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
17,
16,
6
] | [
"passage: TAGS\n#license-cc-by-sa-3.0 #region-us \n# Dataset Card for \"kor_dbpedia_14\"\n\nMore Information needed# Source Data Citation Information"
] |
4c986f8452a87f929b30179f9c525ca256d31662 | # Dataset Card for "MedQA_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hippocrates/MedQA_train | [
"region:us"
] | 2023-12-05T04:34:53+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 28990738, "num_examples": 10178}, {"name": "valid", "num_bytes": 3622152, "num_examples": 1272}, {"name": "test", "num_bytes": 3678270, "num_examples": 1273}], "download_size": 14570611, "dataset_size": 36291160}} | 2023-12-05T20:29:07+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "MedQA_train"
More Information needed | [
"# Dataset Card for \"MedQA_train\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"MedQA_train\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"MedQA_train\"\n\nMore Information needed"
] |
22b0ec2662c73e7c34b3a8488367d2d87078320a | # Dataset Card for "MedMCQA_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hippocrates/MedMCQA_train | [
"region:us"
] | 2023-12-05T04:53:04+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 139904836, "num_examples": 182822}, {"name": "valid", "num_bytes": 3340728, "num_examples": 4183}, {"name": "test", "num_bytes": 3340728, "num_examples": 4183}], "download_size": 52413017, "dataset_size": 146586292}} | 2023-12-05T04:56:57+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "MedMCQA_train"
More Information needed | [
"# Dataset Card for \"MedMCQA_train\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"MedMCQA_train\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"MedMCQA_train\"\n\nMore Information needed"
] |
47939f8766f0db335072d012e9bddec02e26de42 |
# Dataset Card for Evaluation run of Enoch/llama-65b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Enoch/llama-65b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Enoch/llama-65b-hf](https://huggingface.co/Enoch/llama-65b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Enoch__llama-65b-hf",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-05T05:06:29.042599](https://huggingface.co/datasets/open-llm-leaderboard/details_Enoch__llama-65b-hf/blob/main/results_2023-12-05T05-06-29.042599.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6380517777268255,
"acc_stderr": 0.032178718879849834,
"acc_norm": 0.6421210460838432,
"acc_norm_stderr": 0.0328302725617492,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.43425303494253065,
"mc2_stderr": 0.013768101142659904
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449708,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104298
},
"harness|hellaswag|10": {
"acc": 0.6650069707229636,
"acc_stderr": 0.004710234188047369,
"acc_norm": 0.8608842859988051,
"acc_norm_stderr": 0.003453599726736566
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.03794012674697032,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.03794012674697032
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.045595221419582166,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.045595221419582166
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229862,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229862
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121448,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121448
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633506,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633506
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099864,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099864
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.03324708911809117,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.03324708911809117
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.035208939510976534,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.035208939510976534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489124,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489124
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.01385372417092253,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.01385372417092253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.016712467441702517,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.016712467441702517
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426125,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426125
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4941329856584094,
"acc_stderr": 0.012769356925216526,
"acc_norm": 0.4941329856584094,
"acc_norm_stderr": 0.012769356925216526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700033,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700033
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291282,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291282
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.43425303494253065,
"mc2_stderr": 0.013768101142659904
},
"harness|winogrande|5": {
"acc": 0.824782951854775,
"acc_stderr": 0.010684179227706175
},
"harness|gsm8k|5": {
"acc": 0.44806671721000757,
"acc_stderr": 0.013697992668274522
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_Enoch__llama-65b-hf | [
"region:us"
] | 2023-12-05T05:09:08+00:00 | {"pretty_name": "Evaluation run of Enoch/llama-65b-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [Enoch/llama-65b-hf](https://huggingface.co/Enoch/llama-65b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Enoch__llama-65b-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-05T05:06:29.042599](https://huggingface.co/datasets/open-llm-leaderboard/details_Enoch__llama-65b-hf/blob/main/results_2023-12-05T05-06-29.042599.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6380517777268255,\n \"acc_stderr\": 0.032178718879849834,\n \"acc_norm\": 0.6421210460838432,\n \"acc_norm_stderr\": 0.0328302725617492,\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.43425303494253065,\n \"mc2_stderr\": 0.013768101142659904\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449708,\n \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6650069707229636,\n \"acc_stderr\": 0.004710234188047369,\n \"acc_norm\": 0.8608842859988051,\n \"acc_norm_stderr\": 0.003453599726736566\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.03794012674697032,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.03794012674697032\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 
0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121448,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121448\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633506,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633506\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099864,\n \"acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099864\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.03324708911809117,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.03324708911809117\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n 
\"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489124,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489124\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 0.01385372417092253,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.01385372417092253\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n \"acc_stderr\": 0.016712467441702517,\n \"acc_norm\": 0.48268156424581005,\n \"acc_norm_stderr\": 0.016712467441702517\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426125,\n 
\"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426125\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713002,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4941329856584094,\n \"acc_stderr\": 0.012769356925216526,\n \"acc_norm\": 0.4941329856584094,\n \"acc_norm_stderr\": 0.012769356925216526\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700033,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700033\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291282,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291282\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 
0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.43425303494253065,\n \"mc2_stderr\": 0.013768101142659904\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.010684179227706175\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44806671721000757,\n \"acc_stderr\": 0.013697992668274522\n }\n}\n```", "repo_url": "https://huggingface.co/Enoch/llama-65b-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|arc:challenge|25_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|gsm8k|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hellaswag|10_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T05-06-29.042599.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T05-06-29.042599.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T05-06-29.042599.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T05-06-29.042599.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T05-06-29.042599.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T05-06-29.042599.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T05-06-29.042599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": 
"2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": 
"2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T05-06-29.042599.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["**/details_harness|winogrande|5_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-05T05-06-29.042599.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_05T05_06_29.042599", "path": ["results_2023-12-05T05-06-29.042599.parquet"]}, {"split": "latest", "path": ["results_2023-12-05T05-06-29.042599.parquet"]}]}]} | 2023-12-05T05:11:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Enoch/llama-65b-hf
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Enoch/llama-65b-hf on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
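A minimal sketch of that loading step. The details-repo naming convention (`open-llm-leaderboard/details_<org>__<model>`) is inferred from sibling cards in this dump, and the `harness_winogrande_5` config name is one of the configurations listed in this card's metadata; both are assumptions, not guarantees:

```python
# Derive the Open LLM Leaderboard details-repo id for a model.
# Convention (assumed, matching other cards in this dump):
# "open-llm-leaderboard/details_" + model id with "/" replaced by "__".
def details_repo_id(model_id: str) -> str:
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

repo_id = details_repo_id("Enoch/llama-65b-hf")

# With the `datasets` library installed (network access required):
#   from datasets import load_dataset
#   data = load_dataset(repo_id, "harness_winogrande_5", split="train")
```

Any of the other configuration names listed in the metadata above can be substituted for `harness_winogrande_5`.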
## Latest results
These are the latest results from run 2023-12-05T05:06:29.042599 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of Enoch/llama-65b-hf",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Enoch/llama-65b-hf on the Open LLM L... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Enoch/llama-65b-hf",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Enoch/llama... | [
6,
20,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Enoch/llama-65b-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Enoch/llama-65b-hf o... |
85a6dbdd9b27fa2fe54d4a6c45e5cbbda28faa41 | # Dataset Card for "kor_glue"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
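Although the card body is a stub, the repository metadata above lists five GLUE-derived configurations. A minimal loading sketch — the config names are taken from that metadata, and the standard `datasets` API is assumed:

```python
# Configurations listed in this repository's metadata header.
KOR_GLUE_CONFIGS = ("cola", "mrpc", "qnli", "qqp", "wnli")

def check_config(name: str) -> str:
    """Validate a config name before passing it to `load_dataset`."""
    if name not in KOR_GLUE_CONFIGS:
        raise ValueError(f"unknown kor_glue config: {name!r}")
    return name

# With the `datasets` library installed (network access required):
#   from datasets import load_dataset
#   ds = load_dataset("KETI-AIR/kor_glue", check_config("cola"), split="train")
```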
# Source Data Citation Information
```
@article{warstadt2018neural,
title={Neural Network Acceptability Judgments},
author={Warstadt, Alex and Singh, Amanpreet and Bowman, Samuel R},
journal={arXiv preprint arXiv:1805.12471},
year={2018}
}
@inproceedings{wang2019glue,
title={{GLUE}: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding},
author={Wang, Alex and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel R.},
note={In the Proceedings of ICLR.},
year={2019}
}
Note that each GLUE dataset has its own citation. Please see the source to see
the correct citation for each contained dataset.
``` | KETI-AIR/kor_glue | [
"license:cc-by-4.0",
"region:us"
] | 2023-12-05T05:42:54+00:00 | {"license": "cc-by-4.0", "dataset_info": [{"config_name": "cola", "features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "label", "dtype": "int32"}, {"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 569511, "num_examples": 8551}, {"name": "validation", "num_bytes": 72661, "num_examples": 1043}, {"name": "test", "num_bytes": 72979, "num_examples": 1063}], "download_size": 381894, "dataset_size": 715151}, {"config_name": "mrpc", "features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "sentence1", "dtype": "string"}, {"name": "sentence2", "dtype": "string"}, {"name": "label", "dtype": "int32"}, {"name": "idx", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 1078522, "num_examples": 3668}, {"name": "validation", "num_bytes": 120306, "num_examples": 408}, {"name": "test", "num_bytes": 504069, "num_examples": 1725}], "download_size": 1176356, "dataset_size": 1702897}, {"config_name": "qnli", "features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "label", "dtype": "int32"}, {"name": "question", "dtype": "string"}, {"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 28343211, "num_examples": 104743}, {"name": "validation", "num_bytes": 1507016, "num_examples": 5463}, {"name": "test", "num_bytes": 1510880, "num_examples": 5463}], "download_size": 21097078, "dataset_size": 31361107}, {"config_name": "qqp", "features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "question1", "dtype": "string"}, {"name": "question2", "dtype": "string"}, {"name": "label", "dtype": "int32"}, {"name": "idx", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 64564524, "num_examples": 363846}], "download_size": 40798086, "dataset_size": 64564524}, {"config_name": "wnli", "features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "sentence1", "dtype": "string"}, {"name": "sentence2", "dtype": 
"string"}, {"name": "label", "dtype": "int32"}, {"name": "idx", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 132171, "num_examples": 635}, {"name": "validation", "num_bytes": 15331, "num_examples": 71}, {"name": "test", "num_bytes": 47430, "num_examples": 146}], "download_size": 80151, "dataset_size": 194932}], "configs": [{"config_name": "cola", "data_files": [{"split": "train", "path": "cola/train-*"}, {"split": "validation", "path": "cola/validation-*"}, {"split": "test", "path": "cola/test-*"}]}, {"config_name": "mrpc", "data_files": [{"split": "train", "path": "mrpc/train-*"}, {"split": "validation", "path": "mrpc/validation-*"}, {"split": "test", "path": "mrpc/test-*"}]}, {"config_name": "qnli", "data_files": [{"split": "train", "path": "qnli/train-*"}, {"split": "validation", "path": "qnli/validation-*"}, {"split": "test", "path": "qnli/test-*"}]}, {"config_name": "qqp", "data_files": [{"split": "train", "path": "qqp/train-*"}]}, {"config_name": "wnli", "data_files": [{"split": "train", "path": "wnli/train-*"}, {"split": "validation", "path": "wnli/validation-*"}, {"split": "test", "path": "wnli/test-*"}]}]} | 2023-12-05T06:00:09+00:00 | [] | [] | TAGS
#license-cc-by-4.0 #region-us
| # Dataset Card for "kor_glue"
More Information needed
# Source Data Citation Information
| [
"# Dataset Card for \"kor_glue\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
"TAGS\n#license-cc-by-4.0 #region-us \n",
"# Dataset Card for \"kor_glue\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
15,
14,
6
] | [
"passage: TAGS\n#license-cc-by-4.0 #region-us \n# Dataset Card for \"kor_glue\"\n\nMore Information needed# Source Data Citation Information"
] |
5e2444c1db6c05d4e5ea3a1cf29e5744f0abaa66 | # Dataset Card for "Nexusflow/Function_Call_Definitions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
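The metadata header above lists eight configurations, several of which contain spaces and parentheses in their names; a small sketch showing that such names should be passed verbatim (config names taken from the metadata, standard `datasets` API assumed):

```python
# Config names for this repo, as listed in the YAML header above.
CONFIGS = (
    "CVECPE",
    "CVECPE_Multi (Nested)",
    "Climate",
    "OTX",
    "Places",
    "VT_Multi (Nested)",
    "VT_Multi (Parallel)",
    "VirusTotal",
)

# With the `datasets` library installed (network access required),
# pass the config name exactly as written, spaces included:
#   from datasets import load_dataset
#   ds = load_dataset("Nexusflow/Function_Call_Definitions",
#                     "VT_Multi (Nested)", split="train")
```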
| Nexusflow/Function_Call_Definitions | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2023-12-05T06:04:42+00:00 | {"license": "cc-by-nc-sa-4.0", "dataset_info": [{"config_name": "CVECPE", "features": [{"name": "function_calls", "dtype": "string"}, {"name": "descriptions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8237, "num_examples": 2}], "download_size": 13384, "dataset_size": 8237}, {"config_name": "CVECPE_Multi (Nested)", "features": [{"name": "function_calls", "dtype": "string"}, {"name": "descriptions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17425, "num_examples": 20}], "download_size": 15503, "dataset_size": 17425}, {"config_name": "Climate", "features": [{"name": "function_calls", "dtype": "string"}, {"name": "descriptions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2905, "num_examples": 8}], "download_size": 4163, "dataset_size": 2905}, {"config_name": "OTX", "features": [{"name": "function_calls", "dtype": "string"}, {"name": "descriptions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7040, "num_examples": 9}], "download_size": 8407, "dataset_size": 7040}, {"config_name": "Places", "features": [{"name": "function_calls", "dtype": "string"}, {"name": "descriptions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2460, "num_examples": 7}], "download_size": 5759, "dataset_size": 2460}, {"config_name": "VT_Multi (Nested)", "features": [{"name": "function_calls", "dtype": "string"}, {"name": "descriptions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18137, "num_examples": 29}], "download_size": 13810, "dataset_size": 18137}, {"config_name": "VT_Multi (Parallel)", "features": [{"name": "function_calls", "dtype": "string"}, {"name": "descriptions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18137, "num_examples": 29}], "download_size": 13810, "dataset_size": 18137}, {"config_name": "VirusTotal", "features": [{"name": "function_calls", "dtype": "string"}, {"name": "descriptions", "dtype": "string"}], 
"splits": [{"name": "train", "num_bytes": 11501, "num_examples": 12}], "download_size": 11668, "dataset_size": 11501}], "configs": [{"config_name": "CVECPE", "data_files": [{"split": "train", "path": "CVECPE/train-*"}]}, {"config_name": "CVECPE_Multi (Nested)", "data_files": [{"split": "train", "path": "CVECPE_Multi (Nested)/train-*"}]}, {"config_name": "Climate", "data_files": [{"split": "train", "path": "Climate/train-*"}]}, {"config_name": "OTX", "data_files": [{"split": "train", "path": "OTX/train-*"}]}, {"config_name": "Places", "data_files": [{"split": "train", "path": "Places/train-*"}]}, {"config_name": "VT_Multi (Nested)", "data_files": [{"split": "train", "path": "VT_Multi (Nested)/train-*"}]}, {"config_name": "VT_Multi (Parallel)", "data_files": [{"split": "train", "path": "VT_Multi (Parallel)/train-*"}]}, {"config_name": "VirusTotal", "data_files": [{"split": "train", "path": "VirusTotal/train-*"}]}]} | 2023-12-05T07:08:39+00:00 | [] | [] | TAGS
#license-cc-by-nc-sa-4.0 #region-us
| # Dataset Card for "Nexusflow/Function_Call_Definitions"
More Information needed
| [
"# Dataset Card for \"Nexusflow/Function_Call_Definitions\"\n\nMore Information needed"
] | [
"TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n",
"# Dataset Card for \"Nexusflow/Function_Call_Definitions\"\n\nMore Information needed"
] | [
19,
23
] | [
"passage: TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n# Dataset Card for \"Nexusflow/Function_Call_Definitions\"\n\nMore Information needed"
] |
cdf75c523903a6429506c1adbabb28175979f112 |
# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-llm-67b-chat](https://huggingface.co/deepseek-ai/deepseek-llm-67b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-20T05:32:04.370506](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-chat/blob/main/results_2024-01-20T05-32-04.370506.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7202833490892042,
"acc_stderr": 0.029579907486427835,
"acc_norm": 0.7235978318716265,
"acc_norm_stderr": 0.030155588132811505,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5583209009287327,
"mc2_stderr": 0.014945999339089985
},
"harness|arc:challenge|25": {
"acc": 0.6450511945392492,
"acc_stderr": 0.013983036904094083,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.013659980894277371
},
"harness|hellaswag|10": {
"acc": 0.6800438159729137,
"acc_stderr": 0.004655059308602615,
"acc_norm": 0.8679545907189803,
"acc_norm_stderr": 0.0033784824887488673
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210324984,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210324984
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.02461829819586651,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02461829819586651
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.034765996075164785,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.034765996075164785
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.046774730044912,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.046774730044912
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451207,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451207
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5291005291005291,
"acc_stderr": 0.025707658614154947,
"acc_norm": 0.5291005291005291,
"acc_norm_stderr": 0.025707658614154947
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6059113300492611,
"acc_stderr": 0.03438157967036543,
"acc_norm": 0.6059113300492611,
"acc_norm_stderr": 0.03438157967036543
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424218,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424218
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295141,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295141
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7282051282051282,
"acc_stderr": 0.022556551010132354,
"acc_norm": 0.7282051282051282,
"acc_norm_stderr": 0.022556551010132354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8151260504201681,
"acc_stderr": 0.025215992877954202,
"acc_norm": 0.8151260504201681,
"acc_norm_stderr": 0.025215992877954202
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.012809780081878929,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.012809780081878929
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.018889750550956715,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.018889750550956715
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.820627802690583,
"acc_stderr": 0.0257498195691928,
"acc_norm": 0.820627802690583,
"acc_norm_stderr": 0.0257498195691928
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.03247224389917949,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.03247224389917949
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.029634717272371037,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.029634717272371037
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331366,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331366
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436186,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436186
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9080459770114943,
"acc_stderr": 0.010333225570778518,
"acc_norm": 0.9080459770114943,
"acc_norm_stderr": 0.010333225570778518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7861271676300579,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.7861271676300579,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48044692737430167,
"acc_stderr": 0.016709709877661995,
"acc_norm": 0.48044692737430167,
"acc_norm_stderr": 0.016709709877661995
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.0231527224394023,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.0231527224394023
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8038585209003215,
"acc_stderr": 0.02255244778047803,
"acc_norm": 0.8038585209003215,
"acc_norm_stderr": 0.02255244778047803
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.019242526226544536,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.019242526226544536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5834419817470665,
"acc_stderr": 0.01259115324505739,
"acc_norm": 0.5834419817470665,
"acc_norm_stderr": 0.01259115324505739
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02518778666022726,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02518778666022726
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8104575163398693,
"acc_stderr": 0.015856152189980245,
"acc_norm": 0.8104575163398693,
"acc_norm_stderr": 0.015856152189980245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.0267114305555384,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.0267114305555384
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5583209009287327,
"mc2_stderr": 0.014945999339089985
},
"harness|winogrande|5": {
"acc": 0.8421468034727704,
"acc_stderr": 0.010247165248719764
},
"harness|gsm8k|5": {
"acc": 0.623199393479909,
"acc_stderr": 0.013347858757829154
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": 
[{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", 
"path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-32-04.370506.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": 
"2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", 
"path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["**/details_harness|winogrande|5_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["**/details_harness|winogrande|5_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T05-32-04.370506.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_05T06_06_20.627396", "path": ["results_2023-12-05T06-06-20.627396.parquet"]}, {"split": "2024_01_20T05_32_04.370506", "path": ["results_2024-01-20T05-32-04.370506.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T05-32-04.370506.parquet"]}]}]} | 2024-01-20T05:34:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-chat
Dataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-67b-chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
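For illustration, the split names in the configs above appear to be the run timestamp with `:` and `-` normalized to `_` (a convention read off the file list in this card, not an official rule):

```python
# Derive the split name used in this card's configs from a run timestamp.
run_ts = "2024-01-20T05:32:04.370506"   # as shown under "Latest results"
file_ts = run_ts.replace(":", "-")      # form used in the parquet file names
split = file_ts.replace("-", "_")       # form used as the split name
print(split)  # 2024_01_20T05_32_04.370506
```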
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
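A minimal sketch of such a load call. The `details_*` repo id and the `__` separator are assumed from the usual leaderboard naming convention, not stated in this card; `harness_winogrande_5` and the `latest` split do appear in the configs above:

```python
def details_repo(model_id: str) -> str:
    """Build the leaderboard details repo id for a model.

    Assumes the usual convention of prefixing "details_" and
    replacing "/" in the model id with "__".
    """
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

repo = details_repo("deepseek-ai/deepseek-llm-67b-chat")
print(repo)  # open-llm-leaderboard/details_deepseek-ai__deepseek-llm-67b-chat

# With that id, one of the configs above can be loaded (network required):
# from datasets import load_dataset
# data = load_dataset(repo, "harness_winogrande_5", split="latest")
```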
## Latest results
These are the latest results from run 2024-01-20T05:32:04.370506 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-chat\n\n\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-67b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe da... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-chat\n\n\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-67b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the ... | [
6,
195,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-67b-chat\n\n\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-67b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of t... |
cf666de8b31a569e9dea63c716b0c145d763df99 | # Dataset Card for "kor_hellaswag"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
@inproceedings{zellers2019hellaswag,
title={HellaSwag: Can a Machine Really Finish Your Sentence?},
author={Zellers, Rowan and Holtzman, Ari and Bisk, Yonatan and Farhadi, Ali and Choi, Yejin},
booktitle ={Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics},
year={2019}
}
``` | KETI-AIR/kor_hellaswag | [
"license:mit",
"region:us"
] | 2023-12-05T06:22:14+00:00 | {"license": "mit", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "ind", "dtype": "int32"}, {"name": "activity_label", "dtype": "string"}, {"name": "ctx_a", "dtype": "string"}, {"name": "ctx_b", "dtype": "string"}, {"name": "ctx", "dtype": "string"}, {"name": "endings", "sequence": "string"}, {"name": "source_id", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "split_type", "dtype": "string"}, {"name": "label", "dtype": "string"}, {"name": "joined", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 105739666, "num_examples": 39905}, {"name": "validation", "num_bytes": 27367976, "num_examples": 10042}, {"name": "test", "num_bytes": 26340397, "num_examples": 10003}], "download_size": 69994643, "dataset_size": 159448039}} | 2023-12-05T06:23:43+00:00 | [] | [] | TAGS
#license-mit #region-us
| # Dataset Card for "kor_hellaswag"
More Information needed
# Source Data Citation Information
| [
"# Dataset Card for \"kor_hellaswag\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
"TAGS\n#license-mit #region-us \n",
"# Dataset Card for \"kor_hellaswag\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
11,
15,
6
] | [
"passage: TAGS\n#license-mit #region-us \n# Dataset Card for \"kor_hellaswag\"\n\nMore Information needed# Source Data Citation Information"
] |
a8d62094a65ca55e32ed3c9e8bbe16db67ec46f7 | # Dataset Card for "librispeech960-wavlm-large-km1000_asr_tokenized_final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | cmu-mlsp/librispeech960-wavlm-large-km1000_asr_tokenized_final | [
"region:us"
] | 2023-12-05T06:28:08+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "validation_tts", "path": "data/validation_tts-*"}, {"split": "test", "path": "data/test-*"}, {"split": "test_tts", "path": "data/test_tts-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 4809631893, "num_examples": 281241}, {"name": "validation", "num_bytes": 54319982, "num_examples": 5406}, {"name": "validation_tts", "num_bytes": 27159991, "num_examples": 2703}, {"name": "test", "num_bytes": 27180211, "num_examples": 2620}, {"name": "test_tts", "num_bytes": 27180211, "num_examples": 2620}], "download_size": 506035712, "dataset_size": 4945472288}} | 2023-12-05T14:33:35+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "librispeech960-wavlm-large-km1000_asr_tokenized_final"
More Information needed | [
"# Dataset Card for \"librispeech960-wavlm-large-km1000_asr_tokenized_final\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"librispeech960-wavlm-large-km1000_asr_tokenized_final\"\n\nMore Information needed"
] | [
6,
33
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"librispeech960-wavlm-large-km1000_asr_tokenized_final\"\n\nMore Information needed"
] |
2cb9dc80ba0f4394ff1e9e7d72c82068cdbf4f88 | # Dataset Card for "vietnamese-corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tiennv/vietnamese-corpus | [
"region:us"
] | 2023-12-05T06:34:08+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8142342251, "num_examples": 19233991}], "download_size": 4233458271, "dataset_size": 8142342251}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-05T06:59:38+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "vietnamese-corpus"
More Information needed | [
"# Dataset Card for \"vietnamese-corpus\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"vietnamese-corpus\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"vietnamese-corpus\"\n\nMore Information needed"
] |
ec6ae40e2a472b0585044c035f3846561afbce22 | # Dataset Card for "kor_nq_open"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
@article{doi:10.1162/tacl\_a\_00276,
author = {Kwiatkowski, Tom and Palomaki, Jennimaria and Redfield, Olivia and Collins, Michael and Parikh, Ankur and Alberti, Chris and Epstein, Danielle and Polosukhin, Illia and Devlin, Jacob and Lee, Kenton and Toutanova, Kristina and Jones, Llion and Kelcey, Matthew and Chang, Ming-Wei and Dai, Andrew M. and Uszkoreit, Jakob and Le, Quoc and Petrov, Slav},
title = {Natural Questions: A Benchmark for Question Answering Research},
journal = {Transactions of the Association for Computational Linguistics},
volume = {7},
number = {},
pages = {453-466},
year = {2019},
doi = {10.1162/tacl\_a\_00276},
URL = {
https://doi.org/10.1162/tacl_a_00276
},
eprint = {
https://doi.org/10.1162/tacl_a_00276
},
abstract = { We present the Natural Questions corpus, a question answering data set. Questions consist of real anonymized, aggregated queries issued to the Google search engine. An annotator is presented with a question along with a Wikipedia page from the top 5 search results, and annotates a long answer (typically a paragraph) and a short answer (one or more entities) if present on the page, or marks null if no long/short answer is present. The public release consists of 307,373 training examples with single annotations; 7,830 examples with 5-way annotations for development data; and a further 7,842 examples with 5-way annotated sequestered as test data. We present experiments validating quality of the data. We also describe analysis of 25-way annotations on 302 examples, giving insights into human variability on the annotation task. We introduce robust metrics for the purposes of evaluating question answering systems; demonstrate high human upper bounds on these metrics; and establish baseline results using competitive methods drawn from related literature. }
}
@inproceedings{lee-etal-2019-latent,
title = "Latent Retrieval for Weakly Supervised Open Domain Question Answering",
author = "Lee, Kenton and
Chang, Ming-Wei and
Toutanova, Kristina",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P19-1612",
doi = "10.18653/v1/P19-1612",
pages = "6086--6096",
abstract = "Recent work on open domain question answering (QA) assumes strong supervision of the supporting evidence and/or assumes a blackbox information retrieval (IR) system to retrieve evidence candidates. We argue that both are suboptimal, since gold evidence is not always available, and QA is fundamentally different from IR. We show for the first time that it is possible to jointly learn the retriever and reader from question-answer string pairs and without any IR system. In this setting, evidence retrieval from all of Wikipedia is treated as a latent variable. Since this is impractical to learn from scratch, we pre-train the retriever with an Inverse Cloze Task. We evaluate on open versions of five QA datasets. On datasets where the questioner already knows the answer, a traditional IR system such as BM25 is sufficient. On datasets where a user is genuinely seeking an answer, we show that learned retrieval is crucial, outperforming BM25 by up to 19 points in exact match.",
}
``` | KETI-AIR/kor_nq_open | [
"license:cc-by-sa-3.0",
"region:us"
] | 2023-12-05T06:35:09+00:00 | {"license": "cc-by-sa-3.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "question", "dtype": "string"}, {"name": "answer", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 8520218, "num_examples": 87925}, {"name": "validation", "num_bytes": 394518, "num_examples": 3610}], "download_size": 5925491, "dataset_size": 8914736}} | 2023-12-05T06:39:46+00:00 | [] | [] | TAGS
#license-cc-by-sa-3.0 #region-us
| # Dataset Card for "kor_nq_open"
More Information needed
# Source Data Citation Information
| [
"# Dataset Card for \"kor_nq_open\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
"TAGS\n#license-cc-by-sa-3.0 #region-us \n",
"# Dataset Card for \"kor_nq_open\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
17,
16,
6
] | [
"passage: TAGS\n#license-cc-by-sa-3.0 #region-us \n# Dataset Card for \"kor_nq_open\"\n\nMore Information needed# Source Data Citation Information"
] |
628df753856d6e409a470753253a43d45852ea33 | # Dataset Card for "fin_instruct_dpo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gagan3012/fin_instruct_dpo | [
"region:us"
] | 2023-12-05T06:51:29+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 135251054.04366517, "num_examples": 42601}, {"name": "test", "num_bytes": 1368352.9563348205, "num_examples": 431}], "download_size": 76818308, "dataset_size": 136619407.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2023-12-05T09:27:13+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "fin_instruct_dpo"
More Information needed | [
"# Dataset Card for \"fin_instruct_dpo\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"fin_instruct_dpo\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"fin_instruct_dpo\"\n\nMore Information needed"
] |
044ef56a1f779f1709dc742f4fbce6de8e943921 | ## Data
This dataset is uploaded as a .tar.gz file that was originally used to finetune a stable diffusion model.
It consists of 11 Renaissance era portraits of human figures who are often rendered in dynamic poses, showing expression and possibly using gesture.
Renaissance portraits are characterized by realism, with the subject being the focus of the work and the background being plain.
Additionally, the file includes a .csv file with two columns, one that serves as a placeholder for an image path and the other for textual description used in training the model.
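For illustration, a hedged sketch of reading such a two-column caption CSV — the sample file path, column order, and row contents below are assumptions, not taken from the repo (the archive itself can first be unpacked with the standard-library `tarfile.open(...).extractall(...)`):

```python
import csv
import io

def read_caption_csv(csv_text: str) -> list[tuple[str, str]]:
    """Parse a two-column (image_path, caption) CSV into (path, caption) pairs."""
    reader = csv.reader(io.StringIO(csv_text))
    return [(path, caption) for path, caption in reader]

# Invented sample row, mirroring the documented "image path + description" layout.
pairs = read_caption_csv("portraits/0001.jpg,Renaissance portrait of a nobleman\n")
print(pairs)  # → [('portraits/0001.jpg', 'Renaissance portrait of a nobleman')]
```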
Image Format: .jpg <br>
Image Size: 256 x 256px | morj/renaissance_portraits | [
"task_categories:text-to-image",
"size_categories:n<1K",
"language:en",
"license:cc-by-nc-sa-4.0",
"art",
"renaissance",
"finetune",
"doi:10.57967/hf/1427",
"region:us"
] | 2023-12-05T07:07:19+00:00 | {"language": ["en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "pretty_name": "renaissance_portraits", "tags": ["art", "renaissance", "finetune"]} | 2024-02-05T01:13:43+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-to-image #size_categories-n<1K #language-English #license-cc-by-nc-sa-4.0 #art #renaissance #finetune #doi-10.57967/hf/1427 #region-us
| ## Data
This dataset is uploaded as a .URL file that was originally used to finetune a stable diffusion model.
It consists of 11 Renaissance era portraits of human figures who are often rendered in dynamic poses, showing expression and possibly using gesture.
Renaissance portraits are characterized by realism, with the subject being the focus of the work and the background being plain.
Additionally, the file includes a .csv file with two columns, one that serves as a placeholder for an image path and the other for textual description used in training the model.
Image Format: .jpg <br>
Image Size: 256 x 256px | [
"## Data\nThis datset is uploaded as a .URL file that was orginally used to finetune a stable diffusion model.\nIt consists of 11 Renaissance era portraits of human figures whom are often rendered in dynamic poses, showing expression amd possibly using gesture. \nRenaissance portraits are characterized by realism, ... | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #language-English #license-cc-by-nc-sa-4.0 #art #renaissance #finetune #doi-10.57967/hf/1427 #region-us \n",
"## Data\nThis datset is uploaded as a .URL file that was orginally used to finetune a stable diffusion model.\nIt consists of 11 Renaissance era... | [
66,
148
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #language-English #license-cc-by-nc-sa-4.0 #art #renaissance #finetune #doi-10.57967/hf/1427 #region-us \n## Data\nThis datset is uploaded as a .URL file that was orginally used to finetune a stable diffusion model.\nIt consists of 11 Renaissance ... |
3c9f67b2197f79651efe63210dace2d48cc8e026 | # Dataset Card for "kor_piqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
@inproceedings{Bisk2020,
author = {Yonatan Bisk and Rowan Zellers and
Ronan Le Bras and Jianfeng Gao
and Yejin Choi},
title = {PIQA: Reasoning about Physical Commonsense in
Natural Language},
booktitle = {Thirty-Fourth AAAI Conference on
Artificial Intelligence},
year = {2020},
}
``` | KETI-AIR/kor_piqa | [
"license:afl-3.0",
"region:us"
] | 2023-12-05T07:23:15+00:00 | {"license": "afl-3.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "goal", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "label", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 5065072, "num_examples": 16113}, {"name": "validation", "num_bytes": 563144, "num_examples": 1838}, {"name": "test", "num_bytes": 935859, "num_examples": 3084}], "download_size": 3857117, "dataset_size": 6564075}} | 2023-12-05T07:27:32+00:00 | [] | [] | TAGS
#license-afl-3.0 #region-us
| # Dataset Card for "kor_piqa"
More Information needed
# Source Data Citation Information
| [
"# Dataset Card for \"kor_piqa\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
"TAGS\n#license-afl-3.0 #region-us \n",
"# Dataset Card for \"kor_piqa\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
14,
14,
6
] | [
"passage: TAGS\n#license-afl-3.0 #region-us \n# Dataset Card for \"kor_piqa\"\n\nMore Information needed# Source Data Citation Information"
] |
d41039c93cafef6ac93fbda58770455e1bc17697 | Only records that were mistranslated due to translation errors have been removed from https://huggingface.co/datasets/maywell/ko_Ultrafeedback_binarized.<br><br>
예) "두 사람 간의 대화가 주어집니다.'Person1:'과 'Person2:'는 각자의 대화를 구분하는 데 사용됩니다. 대화에 2개 이상의 고유 감정이 존재하는지 분류해야 합니다. 대화에 2개 이상의 고유 감정이 존재하면 출력은 '1'로 분류되고, 그렇지 않으면 '0'으로 분류해야 합니다.예시 입력:Person1: 안녕하세요, 마이크. 뭐 물어볼 게 있어요? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슈"
| hankang2023/Ultrafeedback_binarized.ko.maywell-mini | [
"region:us"
] | 2023-12-05T07:44:43+00:00 | {} | 2023-12-06T00:22:47+00:00 | [] | [] | TAGS
#region-us
| Only records that were mistranslated due to translation errors have been removed from URL.<br><br>
예) "두 사람 간의 대화가 주어집니다.'Person1:'과 'Person2:'는 각자의 대화를 구분하는 데 사용됩니다. 대화에 2개 이상의 고유 감정이 존재하는지 분류해야 합니다. 대화에 2개 이상의 고유 감정이 존재하면 출력은 '1'로 분류되고, 그렇지 않으면 '0'으로 분류해야 합니다.예시 입력:Person1: 안녕하세요, 마이크. 뭐 물어볼 게 있어요? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슨 일이야? 그럼 무슈"
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
18eba943372da841656b8044eeccf94c7f64be31 |
# Sycophancy Rotten Tomatoes Dataset
The generated dataset includes a text (chat between a human and an assistant), the sycophancy of the exchange, and additional information.
### Dataset Structure
The dataset is structured as follows:
- `text`: The generated prompt text of the chat between the human and the assistant.
- `assistant_opinion`: The assistant's opinion, converted to a label (i.e. its final answer).
- `human_opinion`: The human's opinion, converted to a label.
- `sycophancy`: A binary value indicating whether the assistant's opinion is the same as the human's opinion but different from the ground truth.
- `comment`: The initial comment from Rotten Tomatoes.
- `ground_truth`: The actual label of the initial comment.
- `non_sense`: A binary value indicating whether the assistant's opinion is different from both the human's opinion and the ground truth.
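As a hedged illustration of working with these columns, here is a plain-Python sketch that drops nonsensical rows before measuring sycophancy — the sample rows are invented; with the `datasets` library the same filter would be `ds.filter(lambda r: r["non_sense"] == 0)`:

```python
# Invented sample rows using the column names documented above.
rows = [
    {"human_opinion": 1, "assistant_opinion": 1, "ground_truth": 0,
     "sycophancy": 1, "non_sense": 0},
    {"human_opinion": 1, "assistant_opinion": 0, "ground_truth": 1,
     "sycophancy": 0, "non_sense": 1},  # assistant contradicts both -> discard
]

clean = [r for r in rows if r["non_sense"] == 0]  # drop nonsensical exchanges
sycophancy_rate = sum(r["sycophancy"] for r in clean) / len(clean)
print(len(clean), sycophancy_rate)  # → 1 1.0
```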
> The `non_sense` column reports instances where the assistant provides an answer that differs from the ground truth, even though the human has given their opinion that matches the correct label. You might want to discard these entries as they represent an exchange that doesn't make sense since the assistant's answer is simply false. | romaingrx/sycophancy_rotten_tomatoes | [
"task_categories:zero-shot-classification",
"task_categories:text-classification",
"language:en",
"license:openrail",
"region:us"
] | 2023-12-05T07:48:06+00:00 | {"language": ["en"], "license": "openrail", "task_categories": ["zero-shot-classification", "text-classification"]} | 2023-12-05T17:55:10+00:00 | [] | [
"en"
] | TAGS
#task_categories-zero-shot-classification #task_categories-text-classification #language-English #license-openrail #region-us
|
# Sycophancy Rotten Tomatoes Dataset
The generated dataset includes a text (chat between a human and an assistant), the sycophancy of the exchange, and additional information.
### Dataset Structure
The dataset is structured as follows:
- 'text': The generated prompt text of the chat between the human and the assistant.
- 'assistant_opinion': The assistant's opinion, converted to a label (i.e. its final answer).
- 'human_opinion': The human's opinion, converted to a label.
- 'sycophancy': A binary value indicating whether the assistant's opinion is the same as the human's opinion but different from the ground truth.
- 'comment': The initial comment from Rotten Tomatoes.
- 'ground_truth': The actual label of the initial comment.
- 'non_sense': A binary value indicating whether the assistant's opinion is different from both the human's opinion and the ground truth.
> The 'non_sense' column reports instances where the assistant provides an answer that differs from the ground truth, even though the human has given their opinion that matches the correct label. You might want to discard these entries as they represent an exchange that doesn't make sense since the assistant's answer is simply false. | [
"# Sycophancy Rotten Tomatoes Dataset\n\nThe generated dataset includes a text (chat between a human and an assistant), the sycophancy of the exchange, and additional information.",
"### Dataset Structure\n\nThe dataset is structured as follows:\n\n- 'text': The generated prompt text of the chat between the human... | [
"TAGS\n#task_categories-zero-shot-classification #task_categories-text-classification #language-English #license-openrail #region-us \n",
"# Sycophancy Rotten Tomatoes Dataset\n\nThe generated dataset includes a text (chat between a human and an assistant), the sycophancy of the exchange, and additional informati... | [
40,
42,
257
] | [
"passage: TAGS\n#task_categories-zero-shot-classification #task_categories-text-classification #language-English #license-openrail #region-us \n# Sycophancy Rotten Tomatoes Dataset\n\nThe generated dataset includes a text (chat between a human and an assistant), the sycophancy of the exchange, and additional inform... |
dc2592fa912c7144b1b2f1aea6b9ac1a869f5900 | # Dataset Card for "thai_sample_200k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | wannaphong/thai_sample_200k | [
"region:us"
] | 2023-12-05T07:51:20+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1156975584, "num_examples": 200000}], "download_size": 453690863, "dataset_size": 1156975584}} | 2023-12-05T07:53:18+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "thai_sample_200k"
More Information needed | [
"# Dataset Card for \"thai_sample_200k\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"thai_sample_200k\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"thai_sample_200k\"\n\nMore Information needed"
] |
be223ef972e71eec45cf4c21bffe5685e67c6da7 |
## Instruction fine-tuning data for the food-safety domain
* Contains two tasks: multi-document QA and paper QA
* Documents are drawn from Chinese national food-safety standards, textbooks, and review papers | yuyijiong/FoodSafe-Doc-QA-Chinese | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:zh",
"license:cc-by-nc-4.0",
"region:us"
] | 2023-12-05T08:02:16+00:00 | {"language": ["zh"], "license": "cc-by-nc-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]} | 2023-12-05T09:05:29+00:00 | [] | [
"zh"
] | TAGS
#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-nc-4.0 #region-us
|
## Instruction fine-tuning data for the food-safety domain
* Contains two tasks: multi-document QA and paper QA
* Documents are drawn from Chinese national food-safety standards, textbooks, and review papers | [
"## 食品安全领域指令微调数据\n* 包含两个任务:多文档QA、论文QA\n* 文档数据来自中国食品安全国标、教材、综述论文"
] | [
"TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-nc-4.0 #region-us \n",
"## 食品安全领域指令微调数据\n* 包含两个任务:多文档QA、论文QA\n* 文档数据来自中国食品安全国标、教材、综述论文"
] | [
45,
37
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Chinese #license-cc-by-nc-4.0 #region-us \n## 食品安全领域指令微调数据\n* 包含两个任务:多文档QA、论文QA\n* 文档数据来自中国食品安全国标、教材、综述论文"
] |
cfab73689709802342534d140fd816a6bcc4cbcb | This dataset is for the "DrugChat: Towards Enabling ChatGPT-Like Capabilities on Drug Molecule Graphs" paper.
@article{liang2023drugchat,
title={DrugChat: Towards Enabling ChatGPT-Like Capabilities on Drug Molecule Graphs},
author={Liang, Youwei and Zhang, Ruiyi and Zhang, li and Xie, Pengtao},
journal={TechRxiv},
year={2023}
}
There are no changes in data; only added datasets class to download the set from HF and generate split from data in .json files. | avaliev/drugchat | [
"task_categories:question-answering",
"size_categories:10K<n<100K",
"language:en",
"license:bsd-3-clause",
"biology",
"chemistry",
"medical",
"region:us"
] | 2023-12-05T08:04:57+00:00 | {"language": ["en"], "license": "bsd-3-clause", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"], "pretty_name": "DrugChat Dataset", "tags": ["biology", "chemistry", "medical"]} | 2023-12-06T11:00:15+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-10K<n<100K #language-English #license-bsd-3-clause #biology #chemistry #medical #region-us
| This dataset is for the "DrugChat: Towards Enabling ChatGPT-Like Capabilities on Drug Molecule Graphs" paper.
@article{liang2023drugchat,
title={DrugChat: Towards Enabling ChatGPT-Like Capabilities on Drug Molecule Graphs},
author={Liang, Youwei and Zhang, Ruiyi and Zhang, li and Xie, Pengtao},
journal={TechRxiv},
year={2023}
}
There are no changes in data; only added datasets class to download the set from HF and generate split from data in .json files. | [] | [
"TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-English #license-bsd-3-clause #biology #chemistry #medical #region-us \n"
] | [
54
] | [
"passage: TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-English #license-bsd-3-clause #biology #chemistry #medical #region-us \n"
] |
53b33d247947b14eaa62dbcd661a2b20c7ff1604 | # Dataset Card for "thai_sample_500k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | wannaphong/thai_sample_500k | [
"region:us"
] | 2023-12-05T08:06:26+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2878877988, "num_examples": 500000}], "download_size": 1128997330, "dataset_size": 2878877988}} | 2023-12-05T08:09:33+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "thai_sample_500k"
More Information needed | [
"# Dataset Card for \"thai_sample_500k\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"thai_sample_500k\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"thai_sample_500k\"\n\nMore Information needed"
] |
e9d5152c4446d984e71ab960f7844cbd6575da6b | # Dataset Card for "kor_quail"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
@inproceedings{DBLP:conf/aaai/RogersKDR20,
author = {Anna Rogers and
Olga Kovaleva and
Matthew Downey and
Anna Rumshisky},
title = {Getting Closer to {AI} Complete Question Answering: {A} Set of Prerequisite
Real Tasks},
booktitle = {The Thirty-Fourth {AAAI} Conference on Artificial Intelligence, {AAAI}
2020, The Thirty-Second Innovative Applications of Artificial Intelligence
Conference, {IAAI} 2020, The Tenth {AAAI} Symposium on Educational
Advances in Artificial Intelligence, {EAAI} 2020, New York, NY, USA,
February 7-12, 2020},
pages = {8722--8731},
publisher = {{AAAI} Press},
year = {2020},
url = {https://aaai.org/ojs/index.php/AAAI/article/view/6398},
timestamp = {Thu, 04 Jun 2020 13:18:48 +0200},
biburl = {https://dblp.org/rec/conf/aaai/RogersKDR20.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | KETI-AIR/kor_quail | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2023-12-05T08:11:03+00:00 | {"license": "cc-by-nc-sa-4.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "challenge", "path": "data/challenge-*"}]}], "dataset_info": {"features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "id", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "question_id", "dtype": "string"}, {"name": "domain", "dtype": "string"}, {"name": "metadata", "struct": [{"name": "author", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "url", "dtype": "string"}]}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "question_type", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "correct_answer_id", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 27612173, "num_examples": 10246}, {"name": "validation", "num_bytes": 5860893, "num_examples": 2164}, {"name": "challenge", "num_bytes": 1451663, "num_examples": 556}], "download_size": 2671154, "dataset_size": 34924729}} | 2023-12-05T08:19:44+00:00 | [] | [] | TAGS
#license-cc-by-nc-sa-4.0 #region-us
| # Dataset Card for "kor_quail"
More Information needed
# Source Data Citation Information
| [
"# Dataset Card for \"kor_quail\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
"TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n",
"# Dataset Card for \"kor_quail\"\n\nMore Information needed",
"# Source Data Citation Information"
] | [
19,
14,
6
] | [
"passage: TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n# Dataset Card for \"kor_quail\"\n\nMore Information needed# Source Data Citation Information"
] |
f26b00ef528914415cbc2013180ce52ec0831fc2 |
Converting the newsqa dataset to the same format as lmqg/qag_squad for asahi417/lm-question-generation
[GitHub Repo](https://github.com/gabrieltorresgamez/newsqa) | StellarMilk/newsqa | [
"size_categories:10K<n<100K",
"language:en",
"region:us"
] | 2023-12-05T09:01:30+00:00 | {"language": ["en"], "size_categories": ["10K<n<100K"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "newsqa_train.parquet"}, {"split": "validation", "path": "newsqa_validation.parquet"}, {"split": "test", "path": "newsqa_test.parquet"}]}]} | 2023-12-05T09:43:41+00:00 | [] | [
"en"
] | TAGS
#size_categories-10K<n<100K #language-English #region-us
|
Converting the newsqa dataset to the same format as lmqg/qag_squad for asahi417/lm-question-generation
GitHub Repo | [] | [
"TAGS\n#size_categories-10K<n<100K #language-English #region-us \n"
] | [
22
] | [
"passage: TAGS\n#size_categories-10K<n<100K #language-English #region-us \n"
] |
ca1ec6f34863b1afcfe3bd68fd4b41875f3b59c0 |
# Dataset Card for Evaluation run of DiscoResearch/DiscoLM-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DiscoResearch/DiscoLM-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DiscoResearch/DiscoLM-70b](https://huggingface.co/DiscoResearch/DiscoLM-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DiscoResearch__DiscoLM-70b",
"harness_winogrande_5",
split="train")
```
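Each run is stored under a split named after its timestamp (e.g. `2023_12_05T09_06_38.645783`), with the `-` and `:` of the ISO timestamp replaced by underscores. As a minimal sketch (the helper name is hypothetical, standard library only), such a split name can be turned back into a `datetime` for sorting runs chronologically:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_12_05T09_06_38.645783": underscores
    # stand in for the "-" of the date and the ":" of the time.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_to_datetime("2023_12_05T09_06_38.645783"))  # 2023-12-05 09:06:38.645783
```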
## Latest results
These are the [latest results from run 2023-12-05T09:06:38.645783](https://huggingface.co/datasets/open-llm-leaderboard/details_DiscoResearch__DiscoLM-70b/blob/main/results_2023-12-05T09-06-38.645783.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6867356803337994,
"acc_stderr": 0.030796614406114163,
"acc_norm": 0.6887936208915021,
"acc_norm_stderr": 0.03141252742960891,
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.576426250198526,
"mc2_stderr": 0.015041628962992867
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620453,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688065
},
"harness|hellaswag|10": {
"acc": 0.6768571997610038,
"acc_stderr": 0.004667209383690245,
"acc_norm": 0.8609838677554272,
"acc_norm_stderr": 0.0034525630964691296
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7396226415094339,
"acc_stderr": 0.027008766090708045,
"acc_norm": 0.7396226415094339,
"acc_norm_stderr": 0.027008766090708045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6510638297872341,
"acc_stderr": 0.03115852213135779,
"acc_norm": 0.6510638297872341,
"acc_norm_stderr": 0.03115852213135779
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531006,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531006
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823074,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823074
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.02931118867498312,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.02931118867498312
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8434343434343434,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.8434343434343434,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.0180883938390789,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.0180883938390789
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7689075630252101,
"acc_stderr": 0.027381406927868886,
"acc_norm": 0.7689075630252101,
"acc_norm_stderr": 0.027381406927868886
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.040802441856289715,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.040802441856289715
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8752293577981651,
"acc_stderr": 0.01416829835915634,
"acc_norm": 0.8752293577981651,
"acc_norm_stderr": 0.01416829835915634
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8872549019607843,
"acc_stderr": 0.02219857103945679,
"acc_norm": 0.8872549019607843,
"acc_norm_stderr": 0.02219857103945679
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631002,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631002
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628124,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628124
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822583,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822583
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892498,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892498
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8607918263090677,
"acc_stderr": 0.01237878610188515,
"acc_norm": 0.8607918263090677,
"acc_norm_stderr": 0.01237878610188515
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.022497230190967558,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.022497230190967558
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7652733118971061,
"acc_stderr": 0.02407180588767704,
"acc_norm": 0.7652733118971061,
"acc_norm_stderr": 0.02407180588767704
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.021751866060815896,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.021751866060815896
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.02963483847376601,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.02963483847376601
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5449804432855281,
"acc_stderr": 0.012718456618701782,
"acc_norm": 0.5449804432855281,
"acc_norm_stderr": 0.012718456618701782
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306042,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.576426250198526,
"mc2_stderr": 0.015041628962992867
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222795
},
"harness|gsm8k|5": {
"acc": 0.6353297952994693,
"acc_stderr": 0.013258428375662245
}
}
```
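The aggregated `"all"` block above is an average over the per-task metrics. As a purely illustrative sketch (using three accuracy values copied from the JSON above; this is not the leaderboard's exact aggregation formula), recomputing a mean over tasks looks like:

```python
# Per-task accuracies copied verbatim from the results above
results = {
    "harness|arc:challenge|25": 0.6877133105802048,  # acc_norm
    "harness|hellaswag|10": 0.8609838677554272,      # acc_norm
    "harness|winogrande|5": 0.8358326756116812,      # acc
}

# Unweighted mean over the selected tasks
average = sum(results.values()) / len(results)
print(round(average, 4))  # 0.7948
```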
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | open-llm-leaderboard/details_DiscoResearch__DiscoLM-70b | [
"region:us"
] | 2023-12-05T09:09:38+00:00 | {"pretty_name": "Evaluation run of DiscoResearch/DiscoLM-70b", "dataset_summary": "Dataset automatically created during the evaluation run of model [DiscoResearch/DiscoLM-70b](https://huggingface.co/DiscoResearch/DiscoLM-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DiscoResearch__DiscoLM-70b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-05T09:06:38.645783](https://huggingface.co/datasets/open-llm-leaderboard/details_DiscoResearch__DiscoLM-70b/blob/main/results_2023-12-05T09-06-38.645783.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6867356803337994,\n \"acc_stderr\": 0.030796614406114163,\n \"acc_norm\": 0.6887936208915021,\n \"acc_norm_stderr\": 0.03141252742960891,\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.576426250198526,\n \"mc2_stderr\": 0.015041628962992867\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620453,\n \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688065\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6768571997610038,\n \"acc_stderr\": 0.004667209383690245,\n \"acc_norm\": 0.8609838677554272,\n \"acc_norm_stderr\": 0.0034525630964691296\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708045,\n \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708045\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6510638297872341,\n \"acc_stderr\": 0.03115852213135779,\n \"acc_norm\": 0.6510638297872341,\n \"acc_norm_stderr\": 0.03115852213135779\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531006,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531006\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n 
\"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823074,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823074\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.02931118867498312,\n \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.02931118867498312\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8434343434343434,\n \"acc_stderr\": 0.025890520358141454,\n \"acc_norm\": 0.8434343434343434,\n \"acc_norm_stderr\": 0.025890520358141454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n 
\"acc\": 0.7689075630252101,\n \"acc_stderr\": 0.027381406927868886,\n \"acc_norm\": 0.7689075630252101,\n \"acc_norm_stderr\": 0.027381406927868886\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289715,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289715\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8752293577981651,\n \"acc_stderr\": 0.01416829835915634,\n \"acc_norm\": 0.8752293577981651,\n \"acc_norm_stderr\": 0.01416829835915634\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8872549019607843,\n \"acc_stderr\": 0.02219857103945679,\n \"acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.02219857103945679\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243631002,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 
0.03434300243631002\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628124,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628124\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822583,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822583\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8607918263090677,\n \"acc_stderr\": 0.01237878610188515,\n \"acc_norm\": 0.8607918263090677,\n \"acc_norm_stderr\": 0.01237878610188515\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.022497230190967558,\n \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.022497230190967558\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7652733118971061,\n \"acc_stderr\": 0.02407180588767704,\n \"acc_norm\": 0.7652733118971061,\n \"acc_norm_stderr\": 
0.02407180588767704\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.021751866060815896,\n \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.021751866060815896\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5567375886524822,\n \"acc_stderr\": 0.02963483847376601,\n \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.02963483847376601\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5449804432855281,\n \"acc_stderr\": 0.012718456618701782,\n \"acc_norm\": 0.5449804432855281,\n \"acc_norm_stderr\": 0.012718456618701782\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.02688214492230774,\n \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.02688214492230774\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306042,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n 
\"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.576426250198526,\n \"mc2_stderr\": 0.015041628962992867\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222795\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6353297952994693,\n \"acc_stderr\": 0.013258428375662245\n }\n}\n```", "repo_url": "https://huggingface.co/DiscoResearch/DiscoLM-70b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "clementine@hf.co", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|arc:challenge|25_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|gsm8k|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hellaswag|10_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T09-06-38.645783.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T09-06-38.645783.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T09-06-38.645783.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T09-06-38.645783.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T09-06-38.645783.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-05T09-06-38.645783.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-05T09-06-38.645783.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", 
"data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T09-06-38.645783.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", 
"path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["**/details_harness|winogrande|5_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-05T09-06-38.645783.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_05T09_06_38.645783", "path": ["results_2023-12-05T09-06-38.645783.parquet"]}, {"split": "latest", "path": ["results_2023-12-05T09-06-38.645783.parquet"]}]}]} | 2023-12-05T09:10:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DiscoResearch/DiscoLM-70b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model DiscoResearch/DiscoLM-70b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
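A minimal sketch follows (the repository link above is elided as "URL", so the repo id below is an assumption based on the usual Open LLM Leaderboard naming convention `open-llm-leaderboard/details_<org>__<model>`; verify it against the actual repository before use):

```python
# Sketch: build the Hub repo id that stores this model's eval details.
# ASSUMPTION: details repos follow the Open LLM Leaderboard naming
# convention "open-llm-leaderboard/details_<org>__<model>".

def details_repo_id(org: str, model: str) -> str:
    """Return the assumed Hub repo id holding the eval details."""
    return f"open-llm-leaderboard/details_{org}__{model}"

repo = details_repo_id("DiscoResearch", "DiscoLM-70b")
print(repo)  # open-llm-leaderboard/details_DiscoResearch__DiscoLM-70b

# With the `datasets` library installed, one task config could then be
# loaded like this (network access required, so shown as a comment):
#   from datasets import load_dataset
#   details = load_dataset(repo, "harness_winogrande_5", split="latest")
```

Each config name passed as the second argument (e.g. `harness_winogrande_5`) matches one of the 63 evaluated tasks listed in the metadata above.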
## Latest results
These are the latest results from run 2023-12-05T09:06:38.645783 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
| [
"# Dataset Card for Evaluation run of DiscoResearch/DiscoLM-70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model DiscoResearch/DiscoLM-70b on ... | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DiscoResearch/DiscoLM-70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Disc... | [
6,
19,
31,
168,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DiscoResearch/DiscoLM-70b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model DiscoResearch... |
9e8bcf16206e744ea76b1c5271a908fb4bd3a45a | # Dataset Card for "vietnamese-news"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tiennv/vietnamese-news | [
"region:us"
] | 2023-12-05T09:14:46+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 28837412151, "num_examples": 12573213}], "download_size": 15141327938, "dataset_size": 28837412151}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-05T09:45:05+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "vietnamese-news"
More Information needed | [
"# Dataset Card for \"vietnamese-news\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"vietnamese-news\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"vietnamese-news\"\n\nMore Information needed"
] |
23e0633de11f1de0585274c874953f10c85bfcd2 | # Dataset Card for "english-wiki-corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tiennv/english-wiki-corpus | [
"region:us"
] | 2023-12-05T09:46:42+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8275936982, "num_examples": 10686170}], "download_size": 1407476006, "dataset_size": 8275936982}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-05T09:49:56+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "english-wiki-corpus"
More Information needed | [
"# Dataset Card for \"english-wiki-corpus\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"english-wiki-corpus\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"english-wiki-corpus\"\n\nMore Information needed"
] |
6ba82933eaf39b0defbcefe2cf4b9e01f5ad60eb | # Dataset Card for "english-mc4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tiennv/english-mc4 | [
"region:us"
] | 2023-12-05T09:55:22+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24653765251, "num_examples": 14294240}], "download_size": 15068999152, "dataset_size": 24653765251}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-05T10:26:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "english-mc4"
More Information needed | [
"# Dataset Card for \"english-mc4\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"english-mc4\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"english-mc4\"\n\nMore Information needed"
] |
56080b4b6f77c39b0a57147b1025feaabd1d843e | # Dataset Card for RuSRL
## Dataset Summary
This dataset contains annotations of semantic frames and intra-frame syntax for 1500 Russian sentences.
### Dataset Description
Each sentence is annotated with predicate-argument structures. Syntactic information is also provided for each frame.
```
{
"sent_id": 1404,
"tokens": ["в", "такой", "ситуации", "основные", "метеоэлементы",
"-", "температура", ",", "влажность", ",", "давление", "-",
"претерпевают", "малые", "суточные", "изменения", "."],
"synt_head": [12, 2, 0, 4, 12, -1, 4, -1, 6, -1, 8, -1, -1, 15, 15, 12, -1],
"sem_head": [-1, -1, -1, -1, 12, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 12, -1],
"sem_role": ["_", "_", "_", "_", "субъект", "_", "_", "_", "_", "_", "_", "_", "_", "_", "_", "предикат", "_"]
}
```
- **Language:** Russian
- **Size:** 1500 sentences
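As a minimal sketch of working with this layout (assuming records shaped like the example above), the labeled roles of a frame can be recovered by pairing each non-`_` entry of `sem_role` with its token:

```python
# Minimal sketch: pair each labeled token with its semantic role in one
# RuSRL record (record layout as in the example above).
def extract_frame(record):
    return [
        (role, record["tokens"][i])
        for i, role in enumerate(record["sem_role"])
        if role != "_"
    ]
```

For the example record above this yields `[('субъект', 'метеоэлементы'), ('предикат', 'изменения')]`.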
## Citation
```
@inproceedings{shelmanov2014methods,
title={Methods for semantic role labeling of Russian texts},
author={Shelmanov, AO and Smirnov, IV},
booktitle={Computational Linguistics and Intellectual Technologies: Papers from the Annual International Conference Dialogue},
volume={13},
number={20},
pages={607--620},
year={2014}
}
``` | IsaNLP/RuSRL | [
"task_categories:token-classification",
"annotations_creators:expert-generated",
"multilinguality:monolingual",
"size_categories:1K<n<10K",
"language:ru",
"license:cc-by-nc-4.0",
"semantic-role-labeling",
"syntax-parsing",
"tokenization",
"region:us"
] | 2023-12-05T09:58:39+00:00 | {"annotations_creators": ["expert-generated"], "language": ["ru"], "license": "cc-by-nc-4.0", "multilinguality": ["monolingual"], "size_categories": ["1K<n<10K"], "task_categories": ["token-classification"], "pretty_name": "RuSRL", "subtasks": ["semantic-role-labeling", "parsing"], "tags": ["semantic-role-labeling", "syntax-parsing", "tokenization"]} | 2023-12-05T11:01:32+00:00 | [] | [
"ru"
] | TAGS
#task_categories-token-classification #annotations_creators-expert-generated #multilinguality-monolingual #size_categories-1K<n<10K #language-Russian #license-cc-by-nc-4.0 #semantic-role-labeling #syntax-parsing #tokenization #region-us
| # Dataset Card for RuSRL
## Dataset Summary
This dataset contains annotations of semantic frames and intra-frame syntax for 1500 Russian sentences.
### Dataset Description
Each sentence is annotated with predicate-argument structures. Syntactic information is also provided for each frame.
- Language: Russian
- Size: 1500 sentences
| [
"# Dataset Card for RuSRL",
"## Dataset Summary\n\nThis dataset contains annotations of semantic frames and intra-frame syntax for 1500 Russian sentences.",
"### Dataset Description\n\nEach sentence is annotated with predicate-argument structures. Syntactic information is also provided for each frame.\n\n\n\n- ... | [
"TAGS\n#task_categories-token-classification #annotations_creators-expert-generated #multilinguality-monolingual #size_categories-1K<n<10K #language-Russian #license-cc-by-nc-4.0 #semantic-role-labeling #syntax-parsing #tokenization #region-us \n",
"# Dataset Card for RuSRL",
"## Dataset Summary\n\nThis dataset... | [
86,
8,
30,
41
] | [
"passage: TAGS\n#task_categories-token-classification #annotations_creators-expert-generated #multilinguality-monolingual #size_categories-1K<n<10K #language-Russian #license-cc-by-nc-4.0 #semantic-role-labeling #syntax-parsing #tokenization #region-us \n# Dataset Card for RuSRL## Dataset Summary\n\nThis dataset co... |
422ec5314cd50f70ec3f9a0b654fd8ef532584b2 | # Dataset Card for "kdd210_hourly"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
**Download the Dataset**:
```python
from datasets import load_dataset
dataset = load_dataset("LeoTungAnh/kdd210_hourly")
```
**Dataset Card for Air Quality in KDD cup 2018**
Originally, the dataset is from KDD cup 2018, which consists of 270 time series with different starting times. This dataset encompasses 210 hourly time series starting from 2017-01-01T14:00:00. The dataset covers the air quality levels at 59 stations in 2 cities from 01/01/2017 to 31/03/2018.
**Preprocessing information**:
- Grouped by hour (frequency: "1H").
- Applied Standardization as preprocessing technique ("Std").
- Preprocessing steps:
1. Standardizing data.
2. Replacing NaN values with zeros.
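A minimal pure-Python sketch of these two steps (an assumption: statistics are computed per series over its observed values, which the card does not state explicitly):

```python
import math

# Sketch of the preprocessing above: standardize one series using the
# mean/std of its observed (non-NaN) values, then set remaining NaNs to 0
# (zero is the mean of the standardized series).
def standardize_fill(series):
    obs = [x for x in series if not math.isnan(x)]
    mu = sum(obs) / len(obs)
    sigma = math.sqrt(sum((x - mu) ** 2 for x in obs) / len(obs))
    return [0.0 if math.isnan(x) else (x - mu) / sigma for x in series]
```

E.g. `standardize_fill([1.0, float('nan'), 3.0])` gives `[-1.0, 0.0, 1.0]`.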
**Dataset information**:
- Missing values are converted to zeros.
- Number of time series: 210
- Number of training samples: 10802
- Number of validation samples: 10850 (number_of_training_samples + 48)
- Number of testing samples: 10898 (number_of_validation_samples + 48)
**Dataset format**:
```python
Dataset({
features: ['start', 'target', 'feat_static_cat', 'feat_dynamic_real', 'item_id'],
num_rows: 210
})
```
**Data format for a sample**:
- 'start': datetime.datetime
- 'target': list of a time series data
- 'feat_static_cat': time series index
- 'feat_dynamic_real': None
- 'item_id': name of time series
**Data example**:
```python
{'start': datetime.datetime(2017, 1, 1, 14, 0, 0),
'feat_static_cat': [0],
'feat_dynamic_real': None,
'item_id': 'T1',
'target': [ 1.46812152, 1.31685537, 1.26169969, ..., 0.47487208, 0.80586637, 0.33006964]
}
```
**Usage**:
- The dataset can be used with the Transformer, Autoformer, and Informer models available in Huggingface.
- Other algorithms can extract data directly by making use of the 'target' feature.
"region:us"
] | 2023-12-05T10:19:13+00:00 | {"dataset_info": {"features": [{"name": "start", "dtype": "timestamp[s]"}, {"name": "feat_static_cat", "sequence": "uint64"}, {"name": "feat_dynamic_real", "sequence": {"sequence": "float32"}}, {"name": "item_id", "dtype": "string"}, {"name": "target", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 18154839, "num_examples": 210}, {"name": "validation", "num_bytes": 18235479, "num_examples": 210}, {"name": "test", "num_bytes": 18316119, "num_examples": 210}], "download_size": 47737715, "dataset_size": 54706437}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2023-12-06T00:51:46+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "kdd210_hourly"
More Information needed
Download the Dataset:
Dataset Card for Air Quality in KDD cup 2018
Originally, the dataset is from KDD cup 2018, which consists of 270 time series with different starting times. This dataset encompasses 210 hourly time series starting from 2017-01-01T14:00:00. The dataset covers the air quality levels at 59 stations in 2 cities from 01/01/2017 to 31/03/2018.
Preprocessing information:
- Grouped by hour (frequency: "1H").
- Applied Standardization as preprocessing technique ("Std").
- Preprocessing steps:
1. Standardizing data.
2. Replacing NaN values with zeros.
Dataset information:
- Missing values are converted to zeros.
- Number of time series: 210
- Number of training samples: 10802
- Number of validation samples: 10850 (number_of_training_samples + 48)
- Number of testing samples: 10898 (number_of_validation_samples + 48)
Dataset format:
Data format for a sample:
- 'start': datetime.datetime
- 'target': list of a time series data
- 'feat_static_cat': time series index
- 'feat_dynamic_real': None
- 'item_id': name of time series
Data example:
Usage:
- The dataset can be used with the Transformer, Autoformer, and Informer models available in Huggingface.
- Other algorithms can extract data directly by making use of the 'target' feature.
"# Dataset Card for \"kdd210_hourly\"\n\nMore Information needed\n\nDownload the Dataset:\n\n\nDataset Card for Air Quality in KDD cup 2018\n\nOriginally, the dataset is from KDD cup 2018, which consists of 270 time series data with different starting time. This dataset encompasses 210 hourly time series data point... | [
"TAGS\n#region-us \n",
"# Dataset Card for \"kdd210_hourly\"\n\nMore Information needed\n\nDownload the Dataset:\n\n\nDataset Card for Air Quality in KDD cup 2018\n\nOriginally, the dataset is from KDD cup 2018, which consists of 270 time series data with different starting time. This dataset encompasses 210 hour... | [
6,
353
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"kdd210_hourly\"\n\nMore Information needed\n\nDownload the Dataset:\n\n\nDataset Card for Air Quality in KDD cup 2018\n\nOriginally, the dataset is from KDD cup 2018, which consists of 270 time series data with different starting time. This dataset encompasses 210 h... |
cc85c24c863afb05b5a52f68950f9ba6b69975cf |
OpenViVQA: Open-domain Vietnamese Visual Question Answering
=====

The OpenViVQA dataset contains <b>11,000+</b> images with <b>37,000+</b> question-answer pairs which introduces the Text-based Open-ended Visual Question Answering in Vietnamese. This dataset is publicly available to the research community in the VLSP 2023 - ViVRC shared task challenge. You can access the dataset as well as submit your results to evaluate on the private test set on the [Codalab](https://codalab.lisn.upsaclay.fr/competitions/15212#participate) evaluation system.
Link to the OpenViVQA dataset:
- [Train images](train-images.zip) + [train annotations](vlsp2023_train_data.json).
- [Dev images](dev-images.zip) + [dev annotations](vlsp2023_dev_data.json).
- [Test images](test-images.zip) + [test annotations (without answers)](vlsp2023_test_data.json).
If you mention or use any information from our dataset, please cite our paper:
```
@article{NGUYEN2023101868,
title = {OpenViVQA: Task, dataset, and multimodal fusion models for visual question answering in Vietnamese},
journal = {Information Fusion},
volume = {100},
pages = {101868},
year = {2023},
issn = {1566-2535},
doi = {https://doi.org/10.1016/j.inffus.2023.101868},
url = {https://www.sciencedirect.com/science/article/pii/S1566253523001847},
author = {Nghia Hieu Nguyen and Duong T.D. Vo and Kiet {Van Nguyen} and Ngan Luu-Thuy Nguyen},
keywords = {Visual question answering, Vision-language understanding, Low-resource languages, Information fusion, Multimodal representation},
abstract = {In recent years, visual question answering (VQA) has attracted attention from the research community because of its highly potential applications (such as virtual assistance on intelligent cars, assistant devices for blind people, or information retrieval from document images using natural language as queries) and challenge. The VQA task requires methods that have the ability to fuse the information from questions and images to produce appropriate answers. Neural visual question answering models have achieved tremendous growth on large-scale datasets which are mostly for resource-rich languages such as English. However, available datasets narrow the VQA task as the answers selection task or answer classification task. We argue that this form of VQA is far from human ability and eliminates the challenge of the answering aspect in the VQA task by just selecting answers rather than generating them. In this paper, we introduce the OpenViVQA (Open-domain Vietnamese Visual Question Answering) dataset, the first large-scale dataset for VQA with open-ended answers in Vietnamese, consists of 11,000+ images associated with 37,000+ question–answer pairs (QAs). Moreover, we proposed FST, QuMLAG, and MLPAG which fuse information from images and questions, then use these fused features to construct answers as humans iteratively. Our proposed methods achieve results that are competitive with SOTA models such as SAAA, MCAN, LORA, and M4C. The dataset11https://github.com/hieunghia-pat/OpenViVQA-dataset. is available to encourage the research community to develop more generalized algorithms including transformers for low-resource languages such as Vietnamese.}
}
```
### Contact
This repository was constructed under the instruction of the [NLP@UIT Research Group](https://nlp.uit.edu.vn/). For more information, contact the following author:
1. Nghia Hieu Nguyen. Email: nghiangh@uit.edu.vn | uitnlp/OpenViVQA-dataset | [
"task_categories:visual-question-answering",
"size_categories:10K<n<100K",
"language:vi",
"license:mit",
"region:us"
] | 2023-12-05T10:52:34+00:00 | {"language": ["vi"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["visual-question-answering"]} | 2023-12-13T14:37:50+00:00 | [] | [
"vi"
] | TAGS
#task_categories-visual-question-answering #size_categories-10K<n<100K #language-Vietnamese #license-mit #region-us
|
OpenViVQA: Open-domain Vietnamese Visual Question Answering
=====
!examples
The OpenViVQA dataset contains <b>11,000+</b> images with <b>37,000+</b> question-answer pairs which introduces the Text-based Open-ended Visual Question Answering in Vietnamese. This dataset is publicly available to the research community in the VLSP 2023 - ViVRC shared task challenge. You can access the dataset as well as submit your results to evaluate on the private test set on the Codalab evaluation system.
Link to the OpenViVQA dataset:
- Train images + train annotations.
- Dev images + dev annotations.
- Test images + test annotations (without answers).
If you mention or use any information from our dataset, please cite our paper:
### Contact
This repository was constructed under the instruction of the NLP@UIT Research Group. For more information, contact the following author:
1. Nghia Hieu Nguyen. Email: nghiangh@URL | [
"### Contact\n\nThis repository was constructed under the instruction of the NLP@UIT Research Group. For more information, contact the following author:\n1. Nghia Hieu Nguyen. Email: nghiangh@URL"
] | [
"TAGS\n#task_categories-visual-question-answering #size_categories-10K<n<100K #language-Vietnamese #license-mit #region-us \n",
"### Contact\n\nThis repository was constructed under the instruction of the NLP@UIT Research Group. For more information, contact the following author:\n1. Nghia Hieu Nguyen. Email: ngh... | [
45,
47
] | [
"passage: TAGS\n#task_categories-visual-question-answering #size_categories-10K<n<100K #language-Vietnamese #license-mit #region-us \n### Contact\n\nThis repository was constructed under the instruction of the NLP@UIT Research Group. For more information, contact the following author:\n1. Nghia Hieu Nguyen. Email: ... |
770076f077c4c5e298498fa32f804857f46d5134 |
# UltraFeedback - Binarized using the Average of Preference Ratings (Cleaned)
This dataset represents a new iteration on top of [`argilla/ultrafeedback-binarized-preferences`](https://huggingface.co/argilla/ultrafeedback-binarized-preferences),
and is the **recommended and preferred dataset by Argilla to use from now on when fine-tuning on UltraFeedback**.
Read more about Argilla's approach towards UltraFeedback binarization at [`argilla/ultrafeedback-binarized-preferences/README.md`](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences/blob/main/README.md).
## Differences with `argilla/ultrafeedback-binarized-preferences`
Thanks to the recent issue identified by [AllenAI](https://huggingface.co/allenai) related to the TruthfulQA contamination within the
original UltraFeedback dataset due to some prompts being reused from the TruthfulQA dataset (used for benchmarking
in the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) from HuggingFace H4), we also decided
to follow AllenAI's advice and remove those from the UltraFeedback dataset that we binarized using a completely different approach, which
implied using the average of the preference ratings rather than the critique overall score, as
[`HuggingFaceH4/ultrafeedback_binarized`](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized) did.
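As an illustrative sketch of the rating-averaged binarization described above (an assumption about the procedure, not Argilla's exact code):

```python
# Illustrative sketch (not Argilla's exact implementation): each completion
# for a prompt carries several preference ratings; the completion with the
# highest average rating becomes "chosen" and the lowest "rejected".
def binarize(completions):
    scored = sorted(
        completions,
        key=lambda c: sum(c["ratings"]) / len(c["ratings"]),
        reverse=True,
    )
    return scored[0]["text"], scored[-1]["text"]
```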
Besides that, we also saw that not only the rows with the `source=truthful_qa` were contaminated (for obvious reasons), but also some
coming from ShareGPT, so we also removed those by doing a left join with both subsets from the [`truthful_qa`](https://huggingface.co/datasets/truthful_qa) dataset.
Additionally, we also modified the formatting to be aligned with both [`HuggingFaceH4/ultrafeedback_binarized`](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized),
and [`allenai/ultrafeedback_binarized_cleaned`](https://huggingface.co/datasets/allenai/ultrafeedback_binarized_cleaned) in order to ease
the integration within the [`huggingface/alignment-handbook`](https://github.com/huggingface/alignment-handbook) so that the formatting is standardized.
## Reproduce
<a target="_blank" href="https://colab.research.google.com/drive/1XR9P1St4yTNY0tjti_tIjm-yzP5Bfqc0?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
To reproduce the data processing combining both our approach and the suggestions from HuggingFace H4 w.r.t. the formatting and the ones from AllenAI to
remove the TruthfulQA contamination, feel free to run the attached Colab Notebook or just view it at [`notebook.ipynb`](./notebook.ipynb) within this repository.
From Argilla we encourage anyone out there to play around, investigate, and experiment with the data, and we firmly believe in open sourcing what we do, as
we ourselves, as well as the whole community, benefit a lot from open source and we also want to give back.
## Citation
If you find this dataset is useful in your work, please cite the original UltraFeedback dataset: https://huggingface.co/datasets/openbmb/UltraFeedback
Additionally, you may also want to cite our work with Notus 7B, which led the curation of the UltraFeedback dataset:
```bibtex
@misc{notus2023,
author = {Alvaro Bartolome and Gabriel Martin and Daniel Vila},
title = {Notus},
year = {2023},
publisher = {GitHub},
journal = {GitHub Repository},
howpublished = {\url{https://github.com/argilla-io/notus}}
}
```
> Alphabetically ordered by last name due to equal contribution. | argilla/ultrafeedback-binarized-preferences-cleaned | [
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"dpo",
"preference",
"ultrafeedback",
"region:us"
] | 2023-12-05T11:07:34+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation"], "pretty_name": "UltraFeedback Binarized Preferences Cleaned", "dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen-rating", "dtype": "float64"}, {"name": "chosen-model", "dtype": "string"}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected-rating", "dtype": "float64"}, {"name": "rejected-model", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 284937773, "num_examples": 60917}], "download_size": 143257393, "dataset_size": 284937773}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["dpo", "preference", "ultrafeedback"]} | 2023-12-11T14:22:19+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #dpo #preference #ultrafeedback #region-us
|
# UltraFeedback - Binarized using the Average of Preference Ratings (Cleaned)
This dataset represents a new iteration on top of 'argilla/ultrafeedback-binarized-preferences',
and is the recommended and preferred dataset by Argilla to use from now on when fine-tuning on UltraFeedback.
Read more about Argilla's approach towards UltraFeedback binarization at 'argilla/ultrafeedback-binarized-preferences/URL'.
## Differences with 'argilla/ultrafeedback-binarized-preferences'
Thanks to the recent issue identified by AllenAI related to the TruthfulQA contamination within the
original UltraFeedback dataset due to some prompts being reused from the TruthfulQA dataset (used for benchmarking
in the Open LLM Leaderboard from HuggingFace H4), we also decided
to follow AllenAI's advice and remove those from the UltraFeedback dataset that we binarized using a completely different approach, which
implied using the average of the preference ratings rather than the critique overall score, as
'HuggingFaceH4/ultrafeedback_binarized' did.
Besides that, we also saw that not only the rows with the 'source=truthful_qa' were contaminated (for obvious reasons), but also some
coming from ShareGPT, so we also removed those by doing a left join with both subsets from the 'truthful_qa' dataset.
Additionally, we also modified the formatting to be aligned with both 'HuggingFaceH4/ultrafeedback_binarized',
and 'allenai/ultrafeedback_binarized_cleaned' in order to ease
the integration within the 'huggingface/alignment-handbook' so that the formatting is standardized.
## Reproduce
<a target="_blank" href="URL
<img src="URL alt="Open In Colab"/>
</a>
To reproduce the data processing combining both our approach and the suggestions from HuggingFace H4 w.r.t. the formatting and the ones from AllenAI to
remove the TruthfulQA contamination, feel free to run the attached Colab Notebook or just view it at 'URL' within this repository.
From Argilla we encourage anyone out there to play around, investigate, and experiment with the data, and we firmly believe in open sourcing what we do, as
we ourselves, as well as the whole community, benefit a lot from open source and we also want to give back.
If you find this dataset is useful in your work, please cite the original UltraFeedback dataset: URL
Additionally, you may also want to cite our work with Notus 7B, which led the curation of the UltraFeedback dataset:
> Alphabetically ordered by last name due to equal contribution. | [
"# UltraFeedback - Binarized using the Average of Preference Ratings (Cleaned)\n\nThis dataset represents a new iteration on top of 'argilla/ultrafeedback-binarized-preferences',\nand is the recommended and preferred dataset by Argilla to use from now on when fine-tuning on UltraFeedback.\n\nRead more about Argilla... | [
"TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #dpo #preference #ultrafeedback #region-us \n",
"# UltraFeedback - Binarized using the Average of Preference Ratings (Cleaned)\n\nThis dataset represents a new iteration on top of 'argilla/ultrafeedback-binarized-pr... | [
49,
119,
303,
233
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #dpo #preference #ultrafeedback #region-us \n# UltraFeedback - Binarized using the Average of Preference Ratings (Cleaned)\n\nThis dataset represents a new iteration on top of 'argilla/ultrafeedback-binarized... |
2284756c74f7aba6e4a74f75d4b6d52b72d231a5 | Dataset to train model | abhijeet-ta/ads_title_generation | [
"region:us"
] | 2023-12-05T11:42:22+00:00 | {} | 2023-12-05T13:33:13+00:00 | [] | [] | TAGS
#region-us
| Dataset to train model | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
77ee41e19954ff8203f68919c11096b08593a028 | # Dataset Card for "1000_trees_extended_onlytrees"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | pin-lpt/1000_trees_extended_onlytrees | [
"region:us"
] | 2023-12-05T11:49:57+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23779232.0, "num_examples": 10}], "download_size": 23781715, "dataset_size": 23779232.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-05T15:28:55+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "1000_trees_extended_onlytrees"
More Information needed | [
"# Dataset Card for \"1000_trees_extended_onlytrees\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"1000_trees_extended_onlytrees\"\n\nMore Information needed"
] | [
6,
23
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"1000_trees_extended_onlytrees\"\n\nMore Information needed"
] |
173a4c9576528f085e2ef63263f6e010b81ff2d2 | # Dataset Card for "tmp_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | BobaZooba/tmp_dataset | [
"region:us"
] | 2023-12-05T12:33:52+00:00 | {"dataset_info": {"features": [{"name": "hello", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19, "num_examples": 2}], "download_size": 780, "dataset_size": 19}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-05T12:33:55+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "tmp_dataset"
More Information needed | [
"# Dataset Card for \"tmp_dataset\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"tmp_dataset\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"tmp_dataset\"\n\nMore Information needed"
] |
655ef66ef2be07f89aec61407f24c772802eb87d |
# EuroSAT RGB
<!-- Dataset thumbnail -->

<!-- Provide a quick summary of the dataset. -->
EUROSAT RGB is the RGB version of the EUROSAT dataset based on Sentinel-2 satellite images covering 13 spectral bands and consisting of 10 classes with 27000 labeled and geo-referenced samples.
- **Paper:** https://arxiv.org/abs/1709.00029
- **Homepage:** https://github.com/phelber/EuroSAT
## Description
<!-- Provide a longer summary of what this dataset is. -->
The EuroSAT dataset is a comprehensive land cover classification dataset that focuses on images taken by the [ESA Sentinel-2 satellite](https://sentinel.esa.int/web/sentinel/missions/sentinel-2). It contains a total of 27,000 images, each with a resolution of 64x64 pixels. These images cover 10 distinct land cover classes and are collected from over 34 European countries.
The dataset is available in two versions: **RGB only** (this repo) and all 13 [Multispectral (MS) Sentinel-2 bands](https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-2-msi/resolutions/spatial). EuroSAT is considered a relatively easy dataset, with approximately 98.6% accuracy achievable using a ResNet-50 architecture.
- **Total Number of Images**: 27000
- **Bands**: 3 (RGB)
- **Image Resolution**: 64x64 pixels
- **Land Cover Classes**: 10
- Classes: Annual Crop, Forest, Herbaceous Vegetation, Highway, Industrial Buildings, Pasture, Permanent Crop, Residential Buildings, River, SeaLake
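For reference, the integer labels map to class names in the order above (matching the ordering of the dataset's `ClassLabel` feature in this card's metadata):

```python
# Label-id -> class-name mapping, indices 0-9 in the order used by the
# dataset's ClassLabel feature (as listed in the card metadata).
EUROSAT_CLASSES = [
    "Annual Crop", "Forest", "Herbaceous Vegetation", "Highway",
    "Industrial Buildings", "Pasture", "Permanent Crop",
    "Residential Buildings", "River", "SeaLake",
]

def label_name(label_id: int) -> str:
    return EUROSAT_CLASSES[label_id]
```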
## Usage
To use this dataset, simply use `datasets.load_dataset("blanchon/EuroSAT_RGB")`.
<!-- Provide any additional information on how to use this dataset. -->
```python
from datasets import load_dataset
EuroSAT_RGB = load_dataset("blanchon/EuroSAT_RGB")
```
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
If you use the EuroSAT dataset in your research, please consider citing the following publication:
```bibtex
@article{helber2017eurosat,
title={EuroSAT: A Novel Dataset and Deep Learning Benchmark for Land Use and Land Cover Classification},
author={Helber, et al.},
journal={ArXiv preprint arXiv:1709.00029},
year={2017}
}
```
| blanchon/EuroSAT_RGB | [
"task_categories:image-classification",
"size_categories:10K<n<100K",
"language:en",
"license:unknown",
"remote-sensing",
"earth-observation",
"geospatial",
"satellite-imagery",
"land-cover-classification",
"sentinel-2",
"arxiv:1709.00029",
"region:us"
] | 2023-12-05T12:56:11+00:00 | {"language": "en", "license": "unknown", "size_categories": ["10K<n<100K"], "task_categories": ["image-classification"], "paperswithcode_id": "eurosat", "pretty_name": "EuroSAT RGB", "tags": ["remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "land-cover-classification", "sentinel-2"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Annual Crop", "1": "Forest", "2": "Herbaceous Vegetation", "3": "Highway", "4": "Industrial Buildings", "5": "Pasture", "6": "Permanent Crop", "7": "Residential Buildings", "8": "River", "9": "SeaLake"}}}}, {"name": "filename", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 104485303.0, "num_examples": 16200}, {"name": "test", "num_bytes": 34726245.0, "num_examples": 5400}, {"name": "validation", "num_bytes": 34781690.0, "num_examples": 5400}], "download_size": 174279561, "dataset_size": 173993238.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2023-12-05T13:02:42+00:00 | [
"1709.00029"
] | [
"en"
] | TAGS
#task_categories-image-classification #size_categories-10K<n<100K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #land-cover-classification #sentinel-2 #arxiv-1709.00029 #region-us
|
# EuroSAT RGB
!EuroSAT RGB
EUROSAT RGB is the RGB version of the EUROSAT dataset based on Sentinel-2 satellite images covering 13 spectral bands and consisting of 10 classes with 27000 labeled and geo-referenced samples.
- Paper: URL
- Homepage: URL
## Description
The EuroSAT dataset is a comprehensive land cover classification dataset that focuses on images taken by the ESA Sentinel-2 satellite. It contains a total of 27,000 images, each with a resolution of 64x64 pixels. These images cover 10 distinct land cover classes and are collected from over 34 European countries.
The dataset is available in two versions: RGB only (this repo) and all 13 Multispectral (MS) Sentinel-2 bands. EuroSAT is considered a relatively easy dataset, with approximately 98.6% accuracy achievable using a ResNet-50 architecture.
- Total Number of Images: 27000
- Bands: 3 (RGB)
- Image Resolution: 64x64 pixels
- Land Cover Classes: 10
- Classes: Annual Crop, Forest, Herbaceous Vegetation, Highway, Industrial Buildings, Pasture, Permanent Crop, Residential Buildings, River, SeaLake
## Usage
To use this dataset, simply use 'datasets.load_dataset("blanchon/EuroSAT_RGB")'.
If you use the EuroSAT dataset in your research, please consider citing the following publication:
| [
"# EuroSAT RGB\n\n\n!EuroSAT RGB\n\n\nEUROSAT RGB is the RGB version of the EUROSAT dataset based on Sentinel-2 satellite images covering 13 spectral bands and consisting of 10 classes with 27000 labeled and geo-referenced samples.\n- Paper: URL\n- Homepage: URL",
"## Description\n\n\n\nThe EuroSAT dataset is a c... | [
"TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #land-cover-classification #sentinel-2 #arxiv-1709.00029 #region-us \n",
"# EuroSAT RGB\n\n\n!EuroSAT RGB\n\n\nEUROSAT RGB is the RGB versio... | [
83,
63,
207,
53
] | [
"passage: TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #land-cover-classification #sentinel-2 #arxiv-1709.00029 #region-us \n# EuroSAT RGB\n\n\n!EuroSAT RGB\n\n\nEUROSAT RGB is the RGB ver... |
0bc543ea4de948e0522b46edbc57946a8fad8633 |
The Urban Sounds dataset consists of audio samples collected in Amsterdam in the period 2018 - 2020.
The data samples were collected for a project to create a sensor to classify audio events, with the goal of tackling noise pollution in the city.
This 'urban sounds small' dataset is a small part of the dataset, used for testing and prototyping purposes.
More on the sensor can be found here: https://github.com/sensemakersamsterdam/OpenEars
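For prototyping a classifier on these clips, a little waveform preprocessing is usually needed first. The helper below is a hypothetical sketch (it is not part of this dataset or the OpenEars sensor code) that peak-normalizes a decoded float waveform with NumPy:

```python
import numpy as np

def peak_normalize(waveform: np.ndarray) -> np.ndarray:
    """Scale a float waveform so its largest absolute sample is 1.0.

    Silent (all-zero) input is returned unchanged to avoid division by zero.
    """
    peak = np.max(np.abs(waveform))
    if peak == 0:
        return waveform
    return waveform / peak

if __name__ == "__main__":
    clip = np.array([0.5, -2.0, 1.0])
    print(peak_normalize(clip))  # largest |sample| is now 1.0
```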
| UrbanSounds/urban_sounds_small | [
"task_categories:audio-classification",
"size_categories:n<1K",
"language:nl",
"language:en",
"license:apache-2.0",
"audio event",
"noise pollution",
"urban",
"region:us"
] | 2023-12-05T13:06:58+00:00 | {"language": ["nl", "en"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["audio-classification"], "tags": ["audio event", "noise pollution", "urban"]} | 2023-12-07T10:49:13+00:00 | [] | [
"nl",
"en"
]
# EuroSAT MSI
<!-- Dataset thumbnail -->

<!-- Provide a quick summary of the dataset. -->
EUROSAT is a classification dataset based on Sentinel-2 satellite images covering 13 spectral bands and consisting of 10 classes with 27000 labeled and geo-referenced samples.
- **Paper:** https://arxiv.org/abs/1709.00029
- **Homepage:** https://github.com/phelber/EuroSAT
## Description
<!-- Provide a longer summary of what this dataset is. -->
The EuroSAT dataset is a comprehensive land cover classification dataset that focuses on images taken by the [ESA Sentinel-2 satellite](https://sentinel.esa.int/web/sentinel/missions/sentinel-2). It contains a total of 27,000 images, each with a resolution of 64x64 pixels. These images cover 10 distinct land cover classes and are collected from over 34 European countries.
The dataset is available in two versions: RGB only and **all 13** (this repo) [Multispectral (MS) Sentinel-2 bands](https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-2-msi/resolutions/spatial). EuroSAT is considered a relatively easy dataset, with approximately 98.6% accuracy achievable using a ResNet-50 architecture.
- **Total Number of Images**: 27000
- **Bands**: 13 (MSI)
- **Image Size**: 64x64 pixels
- **Land Cover Classes**: 10
- Classes: Annual Crop, Forest, Herbaceous Vegetation, Highway, Industrial Buildings, Pasture, Permanent Crop, Residential Buildings, River, SeaLake
## Usage
To use this dataset, simply use `datasets.load_dataset("blanchon/EuroSAT_MSI")`.
<!-- Provide any additional information on how to use this dataset. -->
```python
from datasets import load_dataset
EuroSAT_MSI = load_dataset("blanchon/EuroSAT_MSI")
```
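With all 13 bands available, a common first step is deriving spectral indices such as NDVI. The band indices below are an assumption: they presume the 13 channels follow the standard Sentinel-2 order (B1, B2, B3, B4, B5, B6, B7, B8, B8A, B9, B10, B11, B12), so red is channel 3 (B4) and near-infrared is channel 7 (B8); verify the ordering against your copy of the data before relying on it.

```python
import numpy as np

RED_IDX, NIR_IDX = 3, 7  # assumed positions of B4 (red) and B8 (NIR)

def ndvi(sample: np.ndarray) -> np.ndarray:
    """NDVI for one (64, 64, 13) uint16 EuroSAT_MSI sample."""
    red = sample[..., RED_IDX].astype(np.float64)
    nir = sample[..., NIR_IDX].astype(np.float64)
    # Guard against zero denominators on fully dark pixels.
    return (nir - red) / np.clip(nir + red, 1e-6, None)

if __name__ == "__main__":
    fake = np.random.randint(0, 10_000, size=(64, 64, 13), dtype=np.uint16)
    print(ndvi(fake).shape)  # (64, 64)
```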
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
If you use the EuroSAT dataset in your research, please consider citing the following publication:
```bibtex
@article{helber2017eurosat,
title={EuroSAT: A Novel Dataset and Deep Learning Benchmark for Land Use and Land Cover Classification},
author={Helber, et al.},
journal={ArXiv preprint arXiv:1709.00029},
year={2017}
}
```
| blanchon/EuroSAT_MSI | [
"task_categories:image-classification",
"size_categories:10K<n<100K",
"language:en",
"license:unknown",
"remote-sensing",
"earth-observation",
"geospatial",
"satellite-imagery",
"land-cover-classification",
"multispectral",
"sentinel-2",
"arxiv:1709.00029",
"region:us"
] | 2023-12-05T13:15:45+00:00 | {"language": "en", "license": "unknown", "size_categories": ["10K<n<100K"], "task_categories": ["image-classification"], "paperswithcode_id": "eurosat", "pretty_name": "EuroSAT MSI", "tags": ["remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "land-cover-classification", "multispectral", "sentinel-2"], "dataset_info": {"features": [{"name": "image", "dtype": {"array3_d": {"dtype": "uint16", "shape": [64, 64, 13]}}}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Annual Crop", "1": "Forest", "2": "Herbaceous Vegetation", "3": "Highway", "4": "Industrial Buildings", "5": "Pasture", "6": "Permanent Crop", "7": "Residential Buildings", "8": "River", "9": "SeaLake"}}}}, {"name": "filename", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1995359806, "num_examples": 16200}, {"name": "test", "num_bytes": 665119564, "num_examples": 5400}, {"name": "validation", "num_bytes": 665120060, "num_examples": 5400}], "download_size": 2379014584, "dataset_size": 3325599430}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2023-12-05T13:33:44+00:00 | [
"1709.00029"
] | [
"en"
]
# ADVANCE
<!-- Dataset thumbnail -->

<!-- Provide a quick summary of the dataset. -->
Audiovisual Aerial Scene Recognition Dataset (ADVANCE) is a comprehensive resource designed for audiovisual aerial scene recognition tasks. It consists of 5,075 pairs of geotagged audio recordings and high-resolution 512x512 RGB images extracted from FreeSound and Google Earth. These images are then labeled into 13 scene categories using OpenStreetMap.
- **Paper:** https://arxiv.org/abs/2005.08449
- **Homepage:** https://akchen.github.io/ADVANCE-DATASET/
## Description
<!-- Provide a longer summary of what this dataset is. -->
The **Audiovisual Aerial Scene Recognition Dataset** is a comprehensive resource designed for audiovisual aerial scene recognition tasks. It consists of 5,075 pairs of geotagged audio recordings and high-resolution 512x512 RGB images extracted from [FreeSound](https://freesound.org/browse/geotags/?c_lat=24&c_lon=20&z=2) and [Google Earth](https://earth.google.com/web/). These images are then labeled into 13 scene categories using OpenStreetMap.
The dataset serves as a valuable benchmark for research and development in audiovisual aerial scene recognition, enabling researchers to explore cross-task transfer learning techniques and geotagged data analysis.
- **Total Number of Images**: 5075
- **Bands**: 3 (RGB)
- **Image Resolution**: 10mm
- **Image size**: 512x512
- **Land Cover Classes**: 13
- **Classes**: airport, beach, bridge, farmland, forest, grassland, harbour, lake, orchard, residential, sparse shrub land, sports land, train station
- **Source**: Google Earth (images) and FreeSound (audio)
- **Dataset features**: 5,075 pairs of geotagged audio recordings and images, three spectral bands - RGB (512x512 px), 10-second audio recordings
- **Dataset format**: images are three-channel JPGs; audio files are in WAV format
## Usage
To use this dataset, simply use `datasets.load_dataset("blanchon/ADVANCE")`.
<!-- Provide any additional information on how to use this dataset. -->
```python
from datasets import load_dataset
ADVANCE = load_dataset("blanchon/ADVANCE")
```
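Because each sample pairs a 512x512 image with a 10-second recording, batching usually requires fixing the audio length exactly. The sketch below pads or truncates a decoded waveform to 10 s; the 16 kHz sample rate is an illustrative assumption, not something the card specifies, so substitute the actual rate reported by the `audio` feature.

```python
import numpy as np

SAMPLE_RATE = 16_000   # assumed rate for illustration only
CLIP_SECONDS = 10      # fixed clip length stated by the dataset card

def fix_length(waveform: np.ndarray) -> np.ndarray:
    """Pad with trailing zeros or truncate to exactly CLIP_SECONDS of samples."""
    target = SAMPLE_RATE * CLIP_SECONDS
    if len(waveform) >= target:
        return waveform[:target]
    return np.pad(waveform, (0, target - len(waveform)))

if __name__ == "__main__":
    short = np.ones(1_000, dtype=np.float32)
    print(fix_length(short).shape)  # (160000,)
```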
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
If you use the ADVANCE dataset in your research, please consider citing the following publication:
```bibtex
@article{hu2020crosstask,
title = {Cross-Task Transfer for Geotagged Audiovisual Aerial Scene Recognition},
author = {Di Hu and Xuhong Li and Lichao Mou and P. Jin and Dong Chen and L. Jing and Xiaoxiang Zhu and D. Dou},
journal = {European Conference on Computer Vision},
year = {2020},
doi = {10.1007/978-3-030-58586-0_5},
bibSource = {Semantic Scholar https://www.semanticscholar.org/paper/7fabb1ef96d2840834cfaf384408309bafc588d5}
}
```
| blanchon/ADVANCE | [
"task_categories:image-classification",
"size_categories:1K<n<10K",
"language:en",
"license:unknown",
"remote-sensing",
"earth-observation",
"geospatial",
"satellite-imagery",
"audiovisual-aerial-scene-recognition",
"sentinel-2",
"arxiv:2005.08449",
"region:us"
] | 2023-12-05T13:38:06+00:00 | {"language": "en", "license": "unknown", "size_categories": ["1K<n<10K"], "task_categories": ["image-classification"], "paperswithcode_id": "advance", "pretty_name": "ADVANCE", "tags": ["remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "audiovisual-aerial-scene-recognition", "sentinel-2"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "audio", "dtype": "audio"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "airport", "1": "beach", "2": "bridge", "3": "farmland", "4": "forest", "5": "grassland", "6": "harbour", "7": "lake", "8": "orchard", "9": "residential", "10": "sparse shrub land", "11": "sports land", "12": "train station"}}}}], "splits": [{"name": "train", "num_bytes": 6698580359.05, "num_examples": 5075}], "download_size": 6688165513, "dataset_size": 6698580359.05}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-05T14:14:32+00:00 | [
"2005.08449"
] | [
"en"
]